Hadoop Audit Log Configuration [from the AIMP platform wiki]

  • HDFS audit
    Configure in log4j.properties (the following is included by default):
    hdfs.audit.logger=INFO,NullAppender
    hdfs.audit.log.maxfilesize=256MB
    hdfs.audit.log.maxbackupindex=20
    log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=${hdfs.audit.logger}
    log4j.additivity.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=false   (false prevents audit entries from also being written to the NameNode log)
    #log4j.logger.org.apache.hadoop.security=DEBUG,RFAAUDIT (if enabled, also captures some authentication/authorization information)
    log4j.appender.RFAAUDIT=org.apache.log4j.RollingFileAppender
    log4j.appender.RFAAUDIT.File=${hadoop.log.dir}/hdfs-audit.log
    log4j.appender.RFAAUDIT.layout=org.apache.log4j.PatternLayout
    log4j.appender.RFAAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
    log4j.appender.RFAAUDIT.MaxFileSize=${hdfs.audit.log.maxfilesize}
    log4j.appender.RFAAUDIT.MaxBackupIndex=${hdfs.audit.log.maxbackupindex}
    

In hadoop-env.sh, modify the hdfs.audit.logger setting inside HADOOP_NAMENODE_OPTS:

export HADOOP_NAMENODE_OPTS=.... -Dhdfs.audit.logger=${HDFS_AUDIT_LOGGER:-INFO,RFAAUDIT} $HADOOP_NAMENODE_OPTS
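After restarting the NameNode, a quick sanity check is to confirm the flag made it onto the running JVM's command line (a minimal sketch; the grep patterns are just an illustration):

    ps -ef | grep NameNode | tr ' ' '\n' | grep 'hdfs.audit.logger'
    # expected output: -Dhdfs.audit.logger=INFO,RFAAUDIT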

The audit log is written to logs/hdfs-audit.log on the NameNode host.
The format looks like:

2014-04-30 10:19:13,173 INFO FSNamesystem.audit: allowed=true    ugi=cdh5 (auth:SIMPLE)    ip=/10.1.251.52    cmd=create    src=/a._COPYING_    dst=null    perm=cdh5:supergroup:rw-r--r--

ugi <user>,<group>[,<group>]*
ip <client ip address>
cmd (open|create|delete|rename|mkdirs|listStatus|setReplication|setOwner|setPermission)
src <path>
dst (<path>|"null")
perm (<user>:<group>:<perm mask>|"null")
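Because every audit line is a series of key=value fields, it is easy to summarize with standard tools. A minimal sketch that counts events per command type (the log path is assumed from the section above):

    # Tally HDFS audit events by cmd= field
    grep -o 'cmd=[^[:space:]]*' logs/hdfs-audit.log | sed 's/^cmd=//' | sort | uniq -c | sort -rn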
  • MapReduce audit
    In log4j.properties, the line log4j.logger.org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger=${mapred.audit.logger} is not present by default; add it manually:
    mapred.audit.logger=INFO,MRAUDIT
    mapred.audit.log.maxfilesize=256MB
    mapred.audit.log.maxbackupindex=20
    log4j.logger.org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger=${mapred.audit.logger}
    log4j.logger.org.apache.hadoop.mapred.AuditLogger=${mapred.audit.logger}
    log4j.additivity.org.apache.hadoop.mapred.AuditLogger=false
    log4j.appender.MRAUDIT=org.apache.log4j.RollingFileAppender
    log4j.appender.MRAUDIT.File=${hadoop.log.dir}/mapred-audit.log
    log4j.appender.MRAUDIT.layout=org.apache.log4j.PatternLayout
    log4j.appender.MRAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c{2}: %m%n
    log4j.appender.MRAUDIT.MaxFileSize=${mapred.audit.log.maxfilesize}
    log4j.appender.MRAUDIT.MaxBackupIndex=${mapred.audit.log.maxbackupindex}
    

The audit log is written to logs/mapred-audit.log on the ResourceManager host.

The format looks like:

2014-04-30 10:35:09,595 INFO resourcemanager.RMAuditLogger: USER=cdh5    IP=10.1.251.52    OPERATION=Submit Application Request    TARGET=ClientRMService    RESULT=SUCCESS    APPID=application_1398825288110_0001
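To verify the configuration, submit any job and watch the audit file. A minimal sketch assuming the stock examples jar that ships with most distributions (the jar path varies by version):

    # Trigger a "Submit Application Request" audit entry
    hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar pi 2 10
    # Then, on the ResourceManager host:
    grep 'OPERATION=Submit Application Request' logs/mapred-audit.log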
  • Hive audit

Auditing of metastore queries and modifications:
conf/hive-log4j.properties

log4j.appender.HAUDIT=org.apache.log4j.DailyRollingFileAppender
log4j.appender.HAUDIT.File=${hive.log.dir}/hive_audit.log
log4j.appender.HAUDIT.DatePattern=.yyyy-MM-dd
log4j.appender.HAUDIT.layout=org.apache.log4j.PatternLayout
log4j.appender.HAUDIT.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
log4j.logger.org.apache.hadoop.hive.metastore.HiveMetaStore.audit=INFO,HAUDIT

Log file: logs/hive_audit.log

Log format:

2014-04-30 11:26:09,918 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=get_database: default
2014-04-30 11:26:09,931 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=get_tables: db=default pat=.*
2014-04-30 11:26:45,153 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,253 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,285 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=get_table : db=default tbl=abc
2014-04-30 11:26:45,315 INFO  HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(242)) - ugi=cdh5    ip=unknown-ip-addr      cmd=drop_table : db=default tbl=abc

  • HBase security

1) Enable HBase security
Add the following to hbase-site.xml on all nodes and restart. The RPC engine is set to SecureRpcEngine because that engine propagates the credentials (such as the username) passed by the remote client.

    <property>
        <name>hbase.rpc.engine</name>
        <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
    </property>
    <property>
        <name>hbase.coprocessor.master.classes</name>
        <value>org.apache.hadoop.hbase.security.access.AccessController</value>
    </property>
    <property>
        <name>hbase.coprocessor.region.classes</name>
        <value>org.apache.hadoop.hbase.security.token.TokenProvider,org.apache.hadoop.hbase.security.access.AccessController</value>
    </property>
    <property>
        <name>hbase.superuser</name>
        <value>cdh5</value>       <!-- specify the superuser account -->
    </property>
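After the restart, you can check that the AccessController coprocessor actually loaded by grepping the master log (a rough sanity check; the log file name pattern depends on your install):

    grep -i 'AccessController' logs/hbase-*-master-*.log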

2) Configure log4j.properties to enable the security audit appender (present by default):

hbase.security.log.file=SecurityAuth.audit
hbase.security.log.maxfilesize=256MB
hbase.security.log.maxbackupindex=20
log4j.appender.RFAS=org.apache.log4j.RollingFileAppender
log4j.appender.RFAS.File=${hbase.log.dir}/${hbase.security.log.file}
log4j.appender.RFAS.MaxFileSize=${hbase.security.log.maxfilesize}
log4j.appender.RFAS.MaxBackupIndex=${hbase.security.log.maxbackupindex}
log4j.appender.RFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.RFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
log4j.category.SecurityLogger=${hbase.security.logger}
log4j.additivity.SecurityLogger=true
log4j.logger.SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController=TRACE
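Note that this block routes through ${hbase.security.logger}, which is not defined in log4j.properties itself; when HBase is started through its daemon scripts, those scripts typically export HBASE_SECURITY_LOGGER with a default of INFO,RFAS. To pin it explicitly you can set it in hbase-env.sh (a sketch; check your version's scripts):

    export HBASE_SECURITY_LOGGER=${HBASE_SECURITY_LOGGER:-"INFO,RFAS"}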

The log format looks like:

2014-06-10 16:09:53,319 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: deleteTable; context: (user=cdh5, scope=yqhtt, family=, action=ADMIN)
2014-06-10 16:09:53,356 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: All users allowed; remote address: /10.1.251.152; request: getClosestRowBefore; context: (user=cdh5, scope=hbase:meta, family=info, action=READ)
2014-06-10 16:09:53,403 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: delete; context: (user=cdh5, scope=hbase:meta, family=info:, action=WRITE)
2014-06-10 16:09:53,444 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: Table permission granted; remote address: /10.1.251.152; request: delete; context: (user=cdh5, scope=hbase:acl, family=l:, action=WRITE)
2014-06-10 16:09:53,471 TRACE SecurityLogger.org.apache.hadoop.hbase.security.access.AccessController: Access allowed for user cdh5; reason: All users allowed; remote address: /10.1.251.152; request: getClosestRowBefore; context: (user=cdh5, scope=hbase:meta, family=info, action=READ)
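Each line records whether access was allowed or denied, plus the reason, so a quick summary is straightforward (log path and message text assumed from the samples above):

    # Tally allowed vs. denied requests in the security audit log
    grep -o 'Access \(allowed\|denied\)' logs/SecurityAuth.audit | sort | uniq -c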
