opengrok
OutOfMemoryError: unable to create new native thread
I created a new index with rc9 and ran into a problem. Here is the error:
SEVERE: Problem updating lucene index database: java.lang.OutOfMemoryError: unable to create new native thread
It happened a few times in my run; here is another example:
Feb 08, 2017 8:12:01 AM org.opensolaris.opengrok.index.IndexDatabase$2 run
SEVERE: Problem updating lucene index database: java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at org.opensolaris.opengrok.util.Executor.exec(Executor.java:193)
at org.opensolaris.opengrok.history.GitHistoryParser.parse(GitHistoryParser.java:161)
at org.opensolaris.opengrok.history.GitRepository.getHistory(GitRepository.java:512)
at org.opensolaris.opengrok.history.GitRepository.getHistory(GitRepository.java:505)
at org.opensolaris.opengrok.history.FileHistoryCache.get(FileHistoryCache.java:499)
at org.opensolaris.opengrok.history.HistoryGuru.getHistory(HistoryGuru.java:242)
at org.opensolaris.opengrok.history.HistoryGuru.getHistory(HistoryGuru.java:202)
at org.opensolaris.opengrok.history.HistoryGuru.getHistoryReader(HistoryGuru.java:186)
at org.opensolaris.opengrok.analysis.AnalyzerGuru.populateDocument(AnalyzerGuru.java:334)
at org.opensolaris.opengrok.index.IndexDatabase.addFile(IndexDatabase.java:622)
at org.opensolaris.opengrok.index.IndexDatabase.indexDown(IndexDatabase.java:883)
at org.opensolaris.opengrok.index.IndexDatabase.indexDown(IndexDatabase.java:848)
at org.opensolaris.opengrok.index.IndexDatabase.indexDown(IndexDatabase.java:848)
at org.opensolaris.opengrok.index.IndexDatabase.indexDown(IndexDatabase.java:848)
at org.opensolaris.opengrok.index.IndexDatabase.indexDown(IndexDatabase.java:848)
at org.opensolaris.opengrok.index.IndexDatabase.update(IndexDatabase.java:397)
at org.opensolaris.opengrok.index.IndexDatabase$2.run(IndexDatabase.java:184)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Is there a setting I need to change, or is this a new bug in this version? Please advise. Thanks.
Do you use Java heap tuning? In our deployment we run the indexer with
JAVA_OPTS="-d64 -XX:-UseGCOverheadLimit -Xmx16384m -server"
Also, if the indexer is run on Linux, you might hit restrictions on the number of processes (nproc) or the max open files limit; check /etc/security/limits.conf.
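For reference, a quick sketch of how to check those limits on the indexing machine (nothing OpenGrok-specific, just standard Linux tools):

# limits for the current shell/user
ulimit -u          # max user processes (on Linux, threads count against this)
ulimit -n          # max open files

# per-user overrides typically live in /etc/security/limits.conf
# or in drop-in files under /etc/security/limits.d/
grep -v '^#' /etc/security/limits.conf /etc/security/limits.d/* 2>/dev/null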
Hmm, but has something significant changed since rc7? It is a Linux box and these machines are pretty big, and my JAVA_OPTS is set to this: JAVA_OPTS='-d64 -Xms32g -Xmx32g'
Also, the current max file limit is 5000000, and here is my ulimit info:
$ ulimit -a
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 1031257
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 8192
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 32768
cpu time (seconds, -t) unlimited
max user processes (-u) 4000
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
This does not seem to be caused by memory shortage as much as by thread limits.
https://blog.fastthread.io/2016/07/06/troubleshoot-outofmemoryerror-unable-to-create-new-native-thread/ and http://stackoverflow.com/questions/16789288/java-lang-outofmemoryerror-unable-to-create-new-native-thread mention some ideas, in particular ulimit -u
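A quick way to compare the live thread count against that limit while the indexer runs could look like this (a sketch; it assumes the indexer process command line contains opengrok.jar):

# PID of the indexer JVM
PID=$(pgrep -f opengrok.jar)
# current number of threads in that JVM
grep Threads /proc/$PID/status
# all threads of the indexing user, to compare against `ulimit -u`
ps -L -u "$USER" --no-headers | wc -l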
I wonder if this is related to the recently added support for renamed files in git which forks more external processes than previously.
It would be useful to gather a thread dump when the exception occurs so that the number of existing threads can be seen.
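One way to capture such a dump (a sketch; it assumes the JDK tools jps/jstack are on PATH and the indexer was started from opengrok.jar):

# locate the indexer JVM and dump its threads when the error shows up
PID=$(jps -l | awk '/opengrok.jar/ {print $1}')
jstack -l "$PID" > /tmp/indexer-threads.txt
# rough count of threads in the dump
grep -c '^"' /tmp/indexer-threads.txt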
I don't know exactly when this occurs; it's during the indexing process (and it actually happens a few times). Is there anything I can do to help capture that, or can you reproduce it somehow on your side?
Can you please disable handling of renamed files and see if that's the problem? It's an option in the configuration; you can set it through the read-only configuration.
There was a change since rc7 in handling renamed files for git.
Thanks for the feedback. I am pretty new to the OpenGrok source code; I suspect you want me to override the setting in
Configuration.java
public void setHandleHistoryOfRenamedFiles(boolean enable) {
this.handleHistoryOfRenamedFiles = enable;
}
or is there an option to pass it on the command line...?
Sorry, there's no command-line option. Of course you can change the code and compile it, but that's not necessary. This is the way:
- Create an empty read-only configuration. (You can actually skip the generation steps below; as you can see from the output, the only thing that differs is the Java version, so if you know yours you can jump straight to replacing the content.)
- Depending on where you are, locate the Groups script (if you cloned the git repository, it's in the tools directory).
- Run it:
OPENGROK_INSTANCE_BASE="x" tools/Groups add group group
The output should be like:
<?xml version="1.0" encoding="UTF-8"?>
<java version="1.8.0_72" class="java.beans.XMLDecoder">
<object class="org.opensolaris.opengrok.configuration.Configuration" id="Configuration0">
<void property="groups">
<void method="add">
<object class="org.opensolaris.opengrok.configuration.Group">
<void property="name">
<string>group</string>
</void>
<void property="pattern">
<string>group</string>
</void>
</object>
</void>
</void>
</object>
</java>
- Now replace the groups-related content with:
<?xml version="1.0" encoding="UTF-8"?>
<java version="1.8.0_72" class="java.beans.XMLDecoder">
<object class="org.opensolaris.opengrok.configuration.Configuration" id="Configuration0">
<void property="handleHistoryOfRenamedFiles"> <!-- this is the renamed history option -->
<boolean>false</boolean> <!-- this is the value -->
</void>
</object>
</java>
- Save this into a file, let's say /var/opengrok/etc/read-only.xml
- Now run the indexer (now or on its next run) with the read-only configuration:
- With the shell wrapper:
READ_XML_CONFIGURATION=/var/opengrok/etc/read-only.xml ./OpenGrok index
- Or directly with the opengrok jar file:
java -jar opengrok.jar -R /var/opengrok/etc/read-only.xml
By the way, this is the way to configure all OpenGrok features which live in the Configuration.java class and don't have a command-line option.
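Putting the steps above together, a minimal sketch that writes the read-only configuration from above and runs the indexer directly (same paths and values as in the example):

# write the read-only configuration that disables renamed-file history handling
cat > /var/opengrok/etc/read-only.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<java version="1.8.0_72" class="java.beans.XMLDecoder">
 <object class="org.opensolaris.opengrok.configuration.Configuration" id="Configuration0">
  <void property="handleHistoryOfRenamedFiles">
   <boolean>false</boolean>
  </void>
 </object>
</java>
EOF

# run the indexer with it
java -jar opengrok.jar -R /var/opengrok/etc/read-only.xml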
I have tested OpenGrok rc8 with handleHistoryOfRenamedFiles disabled and got the same performance as with rc7. So at least on my system the git rename handling is the killer.
Thanks for your support. I now know how to disable the feature and can work with the latest versions.
opengrok 13 rc7:
...
Feb 14, 2017 5:49:25 AM org.opensolaris.opengrok.util.Executor exec
FINE: Finished command [/usr/bin/git, log, --abbrev-commit, --abbrev=8, --name-only, --pretty=fuller, --date=rfc, zlib/zutil.h] in directory /devel/git/panos-orig
Feb 14, 2017 5:49:25 AM org.opensolaris.opengrok.index.DefaultIndexChangedListener fileAdded
FINER: Added: /panos-orig/zlib/zutil.h (CAnalyzer)
Feb 14, 2017 5:49:26 AM org.opensolaris.opengrok.index.IndexDatabase optimize
INFO: Optimizing the index ...
Feb 14, 2017 5:49:26 AM org.opensolaris.opengrok.analysis.Ctags$1 run
WARNING: Error from ctags: ctags: Warning: Language "pascal" already defined
Feb 14, 2017 5:50:19 AM org.opensolaris.opengrok.index.IndexDatabase optimize
INFO: done
Feb 14, 2017 5:50:21 AM org.opensolaris.opengrok.util.Statistics report
INFO: Done indexing data of all repositories (took 1:06:04)
Feb 14, 2017 5:50:21 AM org.opensolaris.opengrok.util.Statistics report
INFO: Total time: 1:06:12
Feb 14, 2017 5:50:21 AM org.opensolaris.opengrok.util.Statistics report
INFO: Final Memory: 40M/2,044M
- UpdateDescriptionCache
- '[' -n /devel/opengrok-data-sros/stage2/etc/paths.tsv -a -s /devel/opengrok-data-sros/stage2/etc/paths.tsv ']'
- /home/siegfrih/local/jdk1.8.0_60-x64/bin/java -classpath /home/siegfrih/local/opengrok-0.13-rc7/lib//opengrok.jar org.opensolaris.opengrok.web.EftarFile /devel/opengrok-data-sros/stage2/etc/paths.tsv /devel/opengrok-data-sros/stage2/data/index/dtags.eftar
real 66m20.424s
user 108m20.307s
sys 13m14.230s

opengrok 13 rc8 rename patch:
...
INFO: Progress: 62,104 (100%)
Feb 14, 2017 7:31:04 AM org.opensolaris.opengrok.index.DefaultIndexChangedListener fileAdd
INFO: Add: /panos-orig/zlib/zutil.h (CAnalyzer)
Feb 14, 2017 7:31:04 AM org.opensolaris.opengrok.util.Executor exec
FINE: Executing command [/usr/bin/git, log, --abbrev-commit, --abbrev=8, --name-only, --pretty=fuller, --date=rfc, zlib/zutil.h] in directory /devel/git/panos-orig
Feb 14, 2017 7:31:04 AM org.opensolaris.opengrok.util.Executor exec
FINE: Finished command [/usr/bin/git, log, --abbrev-commit, --abbrev=8, --name-only, --pretty=fuller, --date=rfc, zlib/zutil.h] in directory /devel/git/panos-orig
Feb 14, 2017 7:31:04 AM org.opensolaris.opengrok.index.DefaultIndexChangedListener fileAdded
FINER: Added: /panos-orig/zlib/zutil.h (CAnalyzer)
Feb 14, 2017 7:31:05 AM org.opensolaris.opengrok.index.IndexDatabase optimize
INFO: Optimizing the index ...
Feb 14, 2017 7:31:05 AM org.opensolaris.opengrok.analysis.Ctags$1 run
WARNING: Error from ctags: ctags: Warning: Language "pascal" already defined
Feb 14, 2017 7:31:58 AM org.opensolaris.opengrok.index.IndexDatabase optimize
INFO: done
Feb 14, 2017 7:31:59 AM org.opensolaris.opengrok.util.Statistics report
INFO: Done indexing data of all repositories (took 1:06:01)
Feb 14, 2017 7:31:59 AM org.opensolaris.opengrok.util.Statistics report
INFO: Total time: 1:06:08
Feb 14, 2017 7:31:59 AM org.opensolaris.opengrok.util.Statistics report
INFO: Final Memory: 40M/2,038M
- UpdateDescriptionCache
- '[' -n /devel/opengrok-data-sros/stage2/etc/paths.tsv -a -s /devel/opengrok-data-sros/stage2/etc/paths.tsv ']'
- /home/siegfrih/local/jdk1.8.0_60-x64/bin/java -classpath /home/siegfrih/local/opengrok-0.13-rc7/lib//opengrok.jar org.opensolaris.opengrok.web.EftarFile /devel/opengrok-data-sros/stage2/etc/paths.tsv /devel/opengrok-data-sros/stage2/data/index/dtags.eftar
real 66m14.478s
user 108m2.622s
sys 13m9.119s
I'm sorry, but I don't see the difference:
Feb 14, 2017 5:50:21 AM org.opensolaris.opengrok.util.Statistics report
INFO: Total time: 1:06:12
Feb 14, 2017 5:50:21 AM org.opensolaris.opengrok.util.Statistics report
INFO: Final Memory: 40M/2,044M
real 66m20.424s
user 108m20.307s
sys 13m14.230s
vs.
Feb 14, 2017 7:31:59 AM org.opensolaris.opengrok.util.Statistics report
INFO: Total time: 1:06:08
Feb 14, 2017 7:31:59 AM org.opensolaris.opengrok.util.Statistics report
INFO: Final Memory: 40M/2,038M
real 66m14.478s
user 108m2.622s
sys 13m9.119s
...exactly, I applied your config patch directly in the source code, compiled it, and it worked as before:
real 66m20.424s vs. real 66m14.478s
=> ~6 seconds difference.
Running the index with this new change; will post results here.
Still failing with the same errors, and with the patch Krystof gave me I now have loads of warnings. Is there going to be a fix for this?
From your ulimit output, max user processes (-u) 4000 might play a role in your case. Can you try to increase it? https://access.redhat.com/solutions/406663
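For example, raising the limit for the account that runs the indexer could look like this (a sketch following the link above; the file name, user name, and value are illustrative, not a recommendation):

# /etc/security/limits.d/90-opengrok.conf  (hypothetical drop-in file)
opengrok  soft  nproc  16384
opengrok  hard  nproc  16384

The indexing user then needs a fresh login session for the new limit to take effect; `ulimit -u` should show the raised value afterwards.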
@tulinkry also mentioned that timer threads created by executors could be a problem - ideally the timers should be shared.
yes. noticed that. will try.
I think I have the same issue after upgrading to the latest (from 113f53c2e261ff598952089dde85b5a20a955380):
There was an error!
Java heap space
java.lang.OutOfMemoryError: Java heap space
at org.apache.lucene.analysis.charfilter.HTMLStripCharFilter.zzUnpackCMap(HTMLStripCharFilter.java:30072)
at org.apache.lucene.analysis.charfilter.HTMLStripCharFilter.<clinit>(HTMLStripCharFilter.java:232)
at org.opensolaris.opengrok.search.Results.getTags(Results.java:105)
at org.opensolaris.opengrok.search.Results.prettyPrint(Results.java:227)
at org.apache.jsp.search_jsp._jspService(search_jsp.java:809)
at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:438)
at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:396)
at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:340)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.opensolaris.opengrok.web.StatisticsFilter.doFilter(StatisticsFilter.java:55)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.opensolaris.opengrok.web.AuthorizationFilter.doFilter(AuthorizationFilter.java:66)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:522)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1095)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
It seems to occur for specific searches; not sure what the pattern is yet (maybe searches whose results are in big files, or in C/C++?).
@amotzte that looks like a different problem.
When inspecting the "out of memory errors" it is necessary to consider at least these things:
- whether this is a webapp or indexer (in your case it is webapp, in the original report it is indexer)
- specifics of the exception - in the original report there is a clear sign of thread space being exhausted (as opposed to the general heap space)
- the stack trace matching - there are no similarities whatsoever. The problem in the original report happened when parsing Git history while yours seems to be specific to displaying search results in the webapp.
It would be nice if you could debug this further (attaching a debugger remotely to Tomcat is possible). The question is whether the input data fed to getTags() caused the problem, or whether memory was already low and the function was just a victim of the situation.
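For the remote-debugging part, a sketch of starting Tomcat with its JPDA support enabled so an IDE debugger can attach (the port is illustrative):

# enable the JDWP listener on port 8000, then attach the IDE debugger to it
export JPDA_TRANSPORT=dt_socket
export JPDA_ADDRESS=8000
$CATALINA_HOME/bin/catalina.sh jpda start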
Thanks @vladak for the reply; I will try to debug a bit further.