Solving the Out-of-Memory Problem with ShallowEtagHeaderFilter on Large File Downloads
Published: 2018-12-23
While working on large file downloads recently, I hit an "Out of memory" exception. Inspecting the controller-layer code, I found it wrote to the response through a BufferedOutputStream whose buffer was only 8 MB, so by rights there should have been no memory overflow.
Reading further, the problem turned out to be in this filter: it copies the buffered stream into a ByteArrayOutputStream before writing to the response. That ByteArrayOutputStream lives entirely in memory and has to grow (reallocate) repeatedly, so a large file download naturally exhausts the heap.
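To make the failure mode concrete, here is a minimal, self-contained sketch (the class name `EtagBufferDemo` is made up for illustration; it only mimics how the filter caches the complete response body in an in-memory byte array before it can compute the ETag hash):

```java
import java.io.ByteArrayOutputStream;

public class EtagBufferDemo {
    public static void main(String[] args) {
        // The ETag filter must see the whole body to hash it, so it
        // accumulates every write into one in-memory buffer.
        ByteArrayOutputStream cache = new ByteArrayOutputStream();
        byte[] chunk = new byte[8 * 1024];   // 8 KB chunks, as a buffered stream would flush
        for (int i = 0; i < 8 * 1024; i++) { // 8192 chunks -> 64 MB in total
            cache.write(chunk, 0, chunk.length);
        }
        // The entire 64 MB payload is now held on the heap at once, and the
        // backing array was copied and doubled repeatedly along the way.
        System.out.println(cache.size());
    }
}
```

With a genuinely large download (gigabytes), this buffer grows until the heap is gone, regardless of how small the controller's own write buffer is.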
Solution
Since most URLs still need this filter and only the file-download URLs must be excluded, OneCoder decided to subclass the filter and add a black list: override doFilterInternal so that URLs on the black list are passed straight to the next filter in the chain, while everything else calls super and goes through the original logic.
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.util.AntPathMatcher;
import org.springframework.util.PathMatcher;
import org.springframework.web.filter.ShallowEtagHeaderFilter;

/**
 * This filter works around the large-file-download problem caused by
 * {@link ShallowEtagHeaderFilter}. URLs on the black list are passed
 * directly to the next filter in the chain; all others are filtered as
 * before.
 * <p>
 * Sample configuration:
 * <pre>{@code
 * <filter>
 *     <filter-name>BigFileEtagFilter</filter-name>
 *     <filter-class>com.coderli.filter.BigFileDownloadEtagHeaderFilter</filter-class>
 *     <!-- URL separators: blank space, ';', ',' and \r\n. The black list is optional. -->
 *     <init-param>
 *         <param-name>blackListURL</param-name>
 *         <param-value> /aa /bb/** /cc/* </param-value>
 *     </init-param>
 * </filter>
 * <filter-mapping>
 *     <filter-name>BigFileEtagFilter</filter-name>
 *     <url-pattern>/*</url-pattern>
 * </filter-mapping>
 * }</pre>
 *
 * @author [email protected]
 * @date 2014-9-12 9:46:38
 */
public class BigFileDownloadEtagHeaderFilter extends ShallowEtagHeaderFilter {

    private static final String[] NULL_STRING_ARRAY = new String[0];
    // Separators accepted in the blackListURL init-param value.
    private static final String URL_SPLIT_PATTERN = "[, ;\r\n]";

    private final PathMatcher pathMatcher = new AntPathMatcher();
    private final Logger logger = LoggerFactory
            .getLogger(BigFileDownloadEtagHeaderFilter.class);

    // url white list
    // private String[] whiteListURLs = null;

    // url black list
    private String[] blackListURLs = null;

    @Override
    public final void initFilterBean() {
        initConfig();
    }
    @Override
    protected void doFilterInternal(HttpServletRequest request,
            HttpServletResponse response, FilterChain filterChain)
            throws ServletException, IOException {
        // getPathInfo() is null for the usual /* filter mapping, so fall
        // back to the servlet path to avoid an NPE during matching.
        String reqUrl = request.getPathInfo() != null
                ? request.getPathInfo() : request.getServletPath();
        if (isBlackURL(reqUrl)) {
            logger.debug("Current url {} is on the black list.", reqUrl);
            filterChain.doFilter(request, response);
        } else {
            super.doFilterInternal(request, response, filterChain);
        }
    }
    private void initConfig() {
        // No need for a white list now.
        // String whiteListURLStr = getFilterConfig().getInitParameter("whiteListURL");
        // whiteListURLs = strToArray(whiteListURLStr);
        String blackListURLStr = getFilterConfig()
                .getInitParameter("blackListURL");
        blackListURLs = strToArray(blackListURLStr);
    }
    // No need for a white list now.
    // private boolean isWhiteURL(String currentURL) {
    //     for (String whiteURL : whiteListURLs) {
    //         if (pathMatcher.match(whiteURL, currentURL)) {
    //             logger.debug("url filter : white url list matches : [{}] match [{}] continue",
    //                     whiteURL, currentURL);
    //             return true;
    //         }
    //         logger.debug("url filter : white url list not matches : [{}] match [{}]",
    //                 whiteURL, currentURL);
    //     }
    //     return false;
    // }
    private boolean isBlackURL(String currentURL) {
        for (String blackURL : blackListURLs) {
            if (pathMatcher.match(blackURL, currentURL)) {
                logger.debug("url filter : black url list matches : [{}] match [{}] break",
                        blackURL, currentURL);
                return true;
            }
            logger.debug("url filter : black url list not matches : [{}] match [{}]",
                    blackURL, currentURL);
        }
        return false;
    }
    private String[] strToArray(String urlStr) {
        if (urlStr == null) {
            return NULL_STRING_ARRAY;
        }
        String[] urlArray = urlStr.split(URL_SPLIT_PATTERN);
        List<String> urlList = new ArrayList<String>();
        for (String url : urlArray) {
            url = url.trim();
            if (url.length() == 0) {
                continue;
            }
            urlList.add(url);
        }
        return urlList.toArray(NULL_STRING_ARRAY);
    }
}
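To see how the blackListURL init-param is parsed, here is a standalone sketch of the same split-and-trim logic as strToArray() (the demo class name is hypothetical; it needs no servlet container to run):

```java
import java.util.ArrayList;
import java.util.List;

public class BlackListParseDemo {
    // Mirrors strToArray(): split on blank space, ',', ';', '\r' and '\n',
    // then trim each token and drop the empty ones.
    static String[] strToArray(String urlStr) {
        if (urlStr == null) {
            return new String[0];
        }
        List<String> urls = new ArrayList<String>();
        for (String url : urlStr.split("[, ;\r\n]")) {
            url = url.trim();
            if (!url.isEmpty()) {
                urls.add(url);
            }
        }
        return urls.toArray(new String[0]);
    }

    public static void main(String[] args) {
        // A messy param value with mixed separators and stray whitespace.
        String[] parsed = strToArray(" /aa /bb/**,\r\n/cc/* ;");
        System.out.println(String.join("|", parsed));
    }
}
```

The three surviving patterns (/aa, /bb/** and /cc/*) are then matched against incoming request paths by Spring's AntPathMatcher, where /cc/* matches one path segment and /bb/** matches any number of segments.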
As for this ShallowEtagHeaderFilter "bug", OneCoder found that someone had already reported it to the Spring team, but Spring apparently considers it a usage issue rather than a bug and will not fix it, so we work around it ourselves.