Elasticsearch write consistency has been removed as of version 5.x
1 Background of the problem
Recently I have kept exploring more of es, reading related material (blog posts and some other tutorials), and came across the write-consistency mechanism. The idea itself is not hard to understand, but you only really internalize it after verifying it in practice, just as having actually used es in a project for all kinds of aggregation, statistics, and search analysis is very different from never having used it.
So I tested it on es 5.4:
PUT myblog/article/1?consistency=all { "title":"test" } { "error": { "root_cause": [ { "type": "illegal_argument_exception", "reason": "request [/myblog/article/1] contains unrecognized parameter: [consistency]" } ], "type": "illegal_argument_exception", "reason": "request [/myblog/article/1] contains unrecognized parameter: [consistency]" }, "status": 400 }
As you can see, the consistency parameter is not recognized. I then ran the same test against 5.2, 5.6, and the old 1.7 release, and found that none of the 5.x versions accept it, while 1.7 does. So it seemed very likely that write consistency had been removed as of es 5.x.
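For comparison, this is roughly what the same request returns on a 1.7 node (a sketch; the exact field values depend on the index state):

PUT myblog/article/1?consistency=all
{
  "title": "test"
}

{
  "_index": "myblog",
  "_type": "article",
  "_id": "1",
  "_version": 1,
  "created": true
}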
I then searched online, but almost all of the articles that came up were derived from the same source document, written against es 5.2, and yet they still described the write-consistency parameter.
So the only option left was to consult the official documentation.
2 Write consistency was removed in es 5.x
The official documentation describes the change in es 5.x as follows:
writeConsistencyLevel removed on write requests
In previous versions of Elasticsearch, the various write requests had a setWriteConsistencyLevel method to set the shard consistency level for write operations. However, the semantics of write consistency were ambiguous as this is just a pre-operation check to ensure the specified number of shards were available before the operation commenced. The write consistency level did not guarantee that the data would be replicated to those number of copies by the time the operation finished. The setWriteConsistencyLevel method on these write requests has been changed to setWaitForActiveShards, which can take a numerical value up to the total number of shard copies or ActiveShardCount.ALL for all shard copies. The default is to just wait for the primary shard to be active before proceeding with the operation. See the section on wait for active shards for more details.
This change affects IndexRequest, IndexRequestBuilder, BulkRequest, BulkRequestBuilder, UpdateRequest, UpdateRequestBuilder, DeleteRequest, and DeleteRequestBuilder.
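In other words, on es 5.x the same kind of pre-operation shard check is requested through the wait_for_active_shards parameter instead. A minimal sketch against the same myblog index used above (the default is 1, meaning only the primary shard needs to be active; all waits for every shard copy to be active before the write proceeds):

PUT myblog/article/1?wait_for_active_shards=all
{
  "title": "test"
}

Note that this is only an availability check before the operation starts; like the old consistency parameter, it does not guarantee the data has been replicated to that many copies when the request returns.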
3 es 6.x?
Will there be further changes in es 6.x?
For now I am not going to dig into the es 6.x material: the company has only just upgraded from es 1.x/2.x to 5.6, and a large share of the projects still run the old versions, so the goal is clear. I will mostly be following es 5.x for the time being, since time and energy are very limited.
It is the same with Spark: 2.x has been out for a while, but since the newest version in use at the company is 1.6.3, my own effort, including my Spark development habits, is still based on Spark 1.6.
Of course, I will certainly move my attention to the newer versions later on.