Hi all, I hope you're doing well,
just have one question regarding the number of metadata blocks: is there a limit to how many metadata blocks we can have? In our case we are testing with a large number (currently 42, with many fields) and we are hitting a "header too large" issue from Dataverse. I tried many of the workarounds.
You've changed requestHeaderSize already (docs)? To what value?
Do the logs indicate how big the header is?
I've set it to 1024000 in the Solr container:
Could not hpack encode GET{u=http://solr:8983/solr/collection1/select?q=*.....
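For reference, here is how a setting like that can be applied in a containerized setup. This is a sketch under assumptions, not my exact file: it assumes the official Solr image (which passes SOLR_OPTS through to the JVM) and a Compose service named solr; the property name solr.jetty.request.header.size is the one Solr's bundled jetty.xml reads for requestHeaderSize.

```yaml
# docker-compose override (hypothetical) -- raise Jetty's request header limit
# inside the Solr container. solr.jetty.request.header.size defaults to 8192
# bytes in Solr's jetty.xml; 1024000 matches the value I mentioned above.
services:
  solr:
    environment:
      SOLR_OPTS: "-Dsolr.jetty.request.header.size=1024000"
```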
PS: I deployed Dataverse with Docker; I'm using version v6.6 for Dataverse and 9.8.0 for Solr.
Ok, so 100 times the number we recommend in the docs. That should be enough!
Do you want to go ahead and open an issue at https://github.com/IQSS/dataverse/issues ?
Thank you, Philip, for the replies.
Even setting that to a larger value couldn't solve the issue; the only workaround I could find was removing some of the older blocks.
I just created an issue for the error.
Best,
Oussama
Thanks! I edited the issue to add a link to this topic in Zulip.
Thanks, Philip. I just have one additional general question:
If the problem with this header is caused by the large number of searchable items, then logically, limiting or decreasing that number should make the header smaller, or at least make it work. However, it didn't work for me (unless I did it incorrectly).
Right, but how are you decreasing the number of searchable items? I'm pretty sure that number is determined by how many fields are available in the database (that is, fields added by loading a metadata block TSV file).
... which is probably a bug
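To see how many fields an installation actually exposes, the list of loaded metadata blocks can be fetched from Dataverse's native API (the /api/metadatablocks endpoint) and the per-block fields counted. A minimal sketch, assuming the response shape is a "data" array of block objects each carrying a "fields" mapping; the stub payload below stands in for the real API response:

```python
# Hypothetical helper: count searchable fields across all metadata blocks,
# given the JSON returned by Dataverse's /api/metadatablocks endpoint.
# The response layout used here is an assumption, not verified against v6.6.
import json

def count_fields(metadatablocks_json: str) -> int:
    """Sum the number of fields declared in every metadata block."""
    data = json.loads(metadatablocks_json)["data"]
    return sum(len(block.get("fields", {})) for block in data)

# Stub payload standing in for the live API response:
sample = json.dumps({
    "status": "OK",
    "data": [
        {"name": "citation", "fields": {"title": {}, "author": {}}},
        {"name": "geospatial", "fields": {"country": {}}},
    ],
})
print(count_fields(sample))  # 3
```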
Last updated: Jan 09 2026 at 14:18 UTC