kettle — Java code fails when running a Hive-related ktr: database type with plugin id [HIVE2] couldn't be found!

1. Running a Hive-related ktr from a Java project:

 
		KettleEnvironment.init();

		TransMeta transMeta = new TransMeta("C:\\Users\\lixiang\\Desktop\\hivetorizhi.ktr");
		// Set the execution mode: clustered, not local or remote
		TransExecutionConfiguration config = new TransExecutionConfiguration();
		config.setExecutingClustered(true);
		config.setExecutingLocally(false);
		config.setExecutingRemotely(false);
		config.setClusterPosting(true);
		config.setClusterPreparing(true);
		config.setClusterStarting(true);
		TransSplitter transSplitter = Trans.executeClustered(transMeta, config);
		System.out.println(transSplitter.getCarteObjectMap());
		System.out.println(transSplitter.getMaster());
		System.out.println(transSplitter.getSlaves()[0]);
		System.out.println(transSplitter.getSlaves()[1].getStepNames()[0]);

2. The error:

Exception in thread "main" org.pentaho.di.core.exception.KettleXMLException: 
Error reading transformation from XML file

Error reading transformation from XML file

Unable to load database connection info from XML node

Unable to create new database interface

database type with plugin id [HIVE2] couldn't be found!

	at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:3402)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2758)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2710)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2687)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2667)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2632)
	at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2595)
	at com.kettle.KettleExamle.getRemoteServer(KettleExamle.java:55)
	at com.kettle.KettleExamle.main(KettleExamle.java:140)
Caused by: org.pentaho.di.core.exception.KettleXMLException: 
Error reading transformation from XML file

Unable to load database connection info from XML node

Unable to create new database interface

database type with plugin id [HIVE2] couldn't be found!

	at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:3384)
	... 8 more
Caused by: org.pentaho.di.core.exception.KettleXMLException: 
Unable to load database connection info from XML node

Unable to create new database interface

database type with plugin id [HIVE2] couldn't be found!

	at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:1013)
	at org.pentaho.di.trans.TransMeta.loadXML(TransMeta.java:2976)
	... 8 more
Caused by: org.pentaho.di.core.exception.KettleXMLException: 
Unable to create new database interface

database type with plugin id [HIVE2] couldn't be found!

	at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:977)
	... 9 more
Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
database type with plugin id [HIVE2] couldn't be found!

	at org.pentaho.di.core.database.DatabaseMeta.findDatabaseInterface(DatabaseMeta.java:584)
	at org.pentaho.di.core.database.DatabaseMeta.getDatabaseInterface(DatabaseMeta.java:558)
	at org.pentaho.di.core.database.DatabaseMeta.<init>(DatabaseMeta.java:974)
	... 9 more

3. Analysis: while reading the database type from the ktr, Kettle cannot resolve the HIVE2 type.
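The failing lookup can be pictured with a small self-contained simulation (the map and method below are illustrative stand-ins, not the real Kettle internals): DatabaseMeta.findDatabaseInterface matches the connection's type attribute against the registered database-type ids and throws when no match exists.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for the lookup in DatabaseMeta.findDatabaseInterface:
// the <connection> type attribute is matched against registered plugin ids.
public class TypeLookupDemo {
    static final Map<String, String> registered = new HashMap<>();
    static {
        // A few of the ids defined in kettle-database-types.xml; HIVE2 is absent.
        registered.put("POSTGRESQL", "org.pentaho.di.core.database.PostgreSQLDatabaseMeta");
        registered.put("MYSQL", "org.pentaho.di.core.database.MySQLDatabaseMeta");
    }

    static String findDatabaseInterface(String pluginId) {
        String classname = registered.get(pluginId);
        if (classname == null) {
            // This is the condition that produces the error above.
            throw new RuntimeException(
                "database type with plugin id [" + pluginId + "] couldn't be found!");
        }
        return classname;
    }

    public static void main(String[] args) {
        System.out.println(findDatabaseInterface("MYSQL"));  // resolves fine
        System.out.println(findDatabaseInterface("HIVE2"));  // throws
    }
}
```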

Inspect the kettle-database-types.xml file inside kettle-core-7.1.0.0-12.jar; it defines the available database types, and HIVE2 is not among them:

<database-types>

   <database-type id="AS/400">                                                        
    <description>AS/400</description>                                                
    <classname>org.pentaho.di.core.database.AS400DatabaseMeta</classname>            
  </database-type>                                                                   

  <database-type id="CACHE">
    <description>Intersystems Cache</description>
    <classname>org.pentaho.di.core.database.CacheDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="DB2">
    <description>IBM DB2</description>
    <classname>org.pentaho.di.core.database.DB2DatabaseMeta</classname>
  </database-type>                                                     

  <database-type id="DBASE">
    <description>dBase III, IV or 5</description>
    <classname>org.pentaho.di.core.database.DbaseDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="DERBY">
    <description>Apache Derby</description>
    <classname>org.pentaho.di.core.database.DerbyDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="EXASOL4">
    <description>Exasol 4</description>
    <classname>org.pentaho.di.core.database.Exasol4DatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="EXTENDB">
    <description>ExtenDB</description>
    <classname>org.pentaho.di.core.database.ExtenDBDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="FIREBIRD">
    <description>Firebird SQL</description>
    <classname>org.pentaho.di.core.database.FirebirdDatabaseMeta</classname>
  </database-type>                                                          

  <database-type id="GENERIC">
    <description>Generic database</description>
    <classname>org.pentaho.di.core.database.GenericDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="GREENPLUM">
    <description>Greenplum</description>
    <classname>org.pentaho.di.core.database.GreenplumDatabaseMeta</classname>
  </database-type>                                                           

  <database-type id="INFINIDB">
    <description>Calpont InfiniDB</description>
    <classname>org.pentaho.di.core.database.InfiniDbDatabaseMeta</classname>
  </database-type>
 
  <database-type id="SQLBASE">
    <description>Gupta SQL Base</description>
    <classname>org.pentaho.di.core.database.GuptaDatabaseMeta</classname>
  </database-type>                                                        

  <database-type id="H2">
    <description>H2</description>
    <classname>org.pentaho.di.core.database.H2DatabaseMeta</classname>
  </database-type>                                                    

  <database-type id="HYPERSONIC">
    <description>Hypersonic</description>
    <classname>org.pentaho.di.core.database.HypersonicDatabaseMeta</classname>
  </database-type>                                                            

  <database-type id="INFOBRIGHT">
    <description>Infobright</description>
    <classname>org.pentaho.di.core.database.InfobrightDatabaseMeta</classname>
  </database-type>                                                            

  <database-type id="INFORMIX">
    <description>Informix</description>
    <classname>org.pentaho.di.core.database.InformixDatabaseMeta</classname>
  </database-type>                                                          

  <database-type id="INGRES">
    <description>Ingres</description>
    <classname>org.pentaho.di.core.database.IngresDatabaseMeta</classname>
  </database-type>                                                        

  <database-type id="VECTORWISE">
    <description>Ingres VectorWise</description>
    <classname>org.pentaho.di.core.database.VectorWiseDatabaseMeta</classname>
  </database-type>                                                        

  <database-type id="INTERBASE">
    <description>Borland Interbase</description>
    <classname>org.pentaho.di.core.database.InterbaseDatabaseMeta</classname>
  </database-type>                                                           

  <database-type id="KINGBASEES">
    <description>KingbaseES</description>
    <classname>org.pentaho.di.core.database.KingbaseESDatabaseMeta</classname>
  </database-type>                                                            

  <database-type id="LucidDB">
    <description>LucidDB</description>
    <classname>org.pentaho.di.core.database.LucidDBDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="MARIADB">
    <description>MariaDB</description>
    <classname>org.pentaho.di.core.database.MariaDBDatabaseMeta</classname>
  </database-type>

  <database-type id="MONETDB">
    <description>MonetDB</description>
    <classname>org.pentaho.di.core.database.MonetDBDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="MSACCESS">
    <description>MS Access</description>
    <classname>org.pentaho.di.core.database.MSAccessDatabaseMeta</classname>
  </database-type>                                                          

  <database-type id="MSSQLNATIVE">
    <description>MS SQL Server (Native)</description>
    <classname>org.pentaho.di.core.database.MSSQLServerNativeDatabaseMeta</classname>
  </database-type>                                                             

  <database-type id="MSSQL">
    <description>MS SQL Server</description>
    <classname>org.pentaho.di.core.database.MSSQLServerDatabaseMeta</classname>
  </database-type>                                                             

  <database-type id="MYSQL">
    <description>MySQL</description>
    <classname>org.pentaho.di.core.database.MySQLDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="MONDRIAN">
    <description>Native Mondrian</description>
    <classname>org.pentaho.di.core.database.MondrianNativeDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="NEOVIEW">
    <description>Neoview</description>
    <classname>org.pentaho.di.core.database.NeoviewDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="NETEZZA">
    <description>Netezza</description>
    <classname>org.pentaho.di.core.database.NetezzaDatabaseMeta</classname>
  </database-type>                                                         

  <database-type id="ORACLE">
    <description>Oracle</description>
    <classname>org.pentaho.di.core.database.OracleDatabaseMeta</classname>
  </database-type>                                                        

  <database-type id="ORACLERDB">
    <description>Oracle RDB</description>
    <classname>org.pentaho.di.core.database.OracleRDBDatabaseMeta</classname>
  </database-type>                                                           

  <database-type id="POSTGRESQL">
    <description>PostgreSQL</description>
    <classname>org.pentaho.di.core.database.PostgreSQLDatabaseMeta</classname>
  </database-type>

  <database-type id="REDSHIFT">
    <description>Redshift</description>
    <classname>org.pentaho.di.core.database.RedshiftDatabaseMeta</classname>
  </database-type>

  <database-type id="REMEDY-AR-SYSTEM">
    <description>Remedy Action Request System</description>
    <classname>org.pentaho.di.core.database.RemedyActionRequestSystemDatabaseMeta</classname>                                                                                           
  </database-type>                                                                          

  <database-type id="SAPDB">
    <description>MaxDB (SAP DB)</description>
    <classname>org.pentaho.di.core.database.SAPDBDatabaseMeta</classname>
  </database-type>                                                       

  <database-type id="SQLITE">
    <description>SQLite</description>
    <classname>org.pentaho.di.core.database.SQLiteDatabaseMeta</classname>
  </database-type>

  <database-type id="SYBASE">
    <description>Sybase</description>
    <classname>org.pentaho.di.core.database.SybaseDatabaseMeta</classname>
  </database-type>

  <database-type id="SYBASEIQ">
    <description>SybaseIQ</description>
    <classname>org.pentaho.di.core.database.SybaseIQDatabaseMeta</classname>
  </database-type>

  <database-type id="TERADATA">
    <description>Teradata</description>
    <classname>org.pentaho.di.core.database.TeradataDatabaseMeta</classname>
  </database-type>

  <database-type id="UNIVERSE">
    <description>UniVerse database</description>
    <classname>org.pentaho.di.core.database.UniVerseDatabaseMeta</classname>
  </database-type>

  <database-type id="VERTICA">
    <description>Vertica</description>
    <classname>org.pentaho.di.core.database.VerticaDatabaseMeta</classname>
  </database-type>
  
  <database-type id="VERTICA5">
    <description>Vertica 5+</description>
    <classname>org.pentaho.di.core.database.Vertica5DatabaseMeta</classname>
  </database-type>
 
 </database-types>

4. The fix:

Hive connects through its JDBC API, following the same approach as the PostgreSQL connection. So, modeled on PostgreSQL, we define a custom database-type with id HIVE2 by adding the following entry to kettle-database-types.xml in kettle-core-7.1.0.0-12.jar:

  <database-type id="HIVE2">
    <description>HIVE2</description>
    <classname>org.pentaho.di.core.database.Hive2SQLDatabaseMeta</classname>
  </database-type>

This entry references a corresponding implementation class, org.pentaho.di.core.database.Hive2SQLDatabaseMeta.

Write it by modeling the PostgreSQL implementation class org.pentaho.di.core.database.PostgreSQLDatabaseMeta; only the connection URL and the JDBC driver class name need to be changed to their Hive equivalents.
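As a minimal standalone sketch, these are the two methods that change relative to PostgreSQLDatabaseMeta (in the real class, Hive2SQLDatabaseMeta extends BaseDatabaseMeta and implements DatabaseInterface, overriding many more methods; the plain class below only demonstrates the driver and URL changes):

```java
// Standalone sketch: in the actual class these methods override
// DatabaseInterface; everything else can be copied from PostgreSQLDatabaseMeta.
public class Hive2SQLDatabaseMeta {
    // HiveServer2 JDBC driver instead of org.postgresql.Driver
    public String getDriverClass() {
        return "org.apache.hive.jdbc.HiveDriver";
    }

    // jdbc:hive2:// URL scheme instead of jdbc:postgresql://
    public String getURL(String hostname, String port, String databaseName) {
        return "jdbc:hive2://" + hostname + ":" + port + "/" + databaseName;
    }

    public static void main(String[] args) {
        Hive2SQLDatabaseMeta meta = new Hive2SQLDatabaseMeta();
        System.out.println(meta.getDriverClass());
        System.out.println(meta.getURL("localhost", "10000", "default"));
        // prints: org.apache.hive.jdbc.HiveDriver
        //         jdbc:hive2://localhost:10000/default
    }
}
```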

Compile org.pentaho.di.core.database.Hive2SQLDatabaseMeta to a .class file and place it into kettle-core-7.1.0.0-12.jar under the org/pentaho/di/core/database path.

5. Run the ktr again:

It now runs successfully. Output:

{/hivetorizhi (master)=1cc5f5bb-b0c4-4938-96f7-85363f88564d, /hivetorizhi (local:slave1-8081)=09ea3633-f5bd-4e43-9352-62b958255469, /hivetorizhi (local:slave2-8082)=35e00dc2-f570-4f26-8540-156fb84d9524}
/hivetorizhi (master)
/hivetorizhi (local:slave2-8082)
寫日誌

over
