I have been asking in the Google Groups community about customizing the new metadata block (this is the "https://groups.google.com/g/dataverse-community/c/VFMoX9G-D60" thread). I want to create a dropdown (select) element like "authorIdentifierScheme" in the citation block. I'm sharing my .tvs file and my .properties file. I also got help from Julian, but what he told me to do didn't work.
lineaInvProperties.png
lineaInvtvs.png
This is how my two fields are displayed in my Dataverse installation.
dataverseLineaInv.png
This is how I want it to look.
option.png
Does anyone know how I can do it? Thanks!
You say you shared the .tsv already? Can you please upload it here?
lineaInv.properties
this is the .properties file
Thanks!
@Philip Durbin, did you see anything wrong in my files?
Are you able to load that file? I get an error:
dev_dataverse> java.lang.ArrayIndexOutOfBoundsException: Index -1 out of bounds for length 15
dev_dataverse> at java.base/java.util.Arrays$ArrayList.get(Unknown Source)
dev_dataverse> at edu.harvard.iq.dataverse.api.DatasetFieldServiceApi.getArrayIndexOutOfBoundMessage(DatasetFieldServiceApi.java:349)
dev_dataverse> at edu.harvard.iq.dataverse.api.DatasetFieldServiceApi.loadDatasetFields(DatasetFieldServiceApi.java:294)
I am doing this on a regular Dataverse install, not a developer install, so when I index Solr I don't get any errors.
Sorry, I mean I get that error when I try to load it like this:
curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file lineaInv.tvs
I ran the command while watching the logs, and I get this error:
[2023-09-26T13:29:26.968-0500] [Payara 5.2022.3] [SEVERE] [] [edu.harvard.iq.dataverse.api.errorhandlers.ThrowableHandler] [tid: _ThreadID=75 _ThreadName=http-thread-pool::http-listener-1(3)] [timeMillis: 1695752966968] [levelValue: 1000] [[
_status="ERROR";_code=500;_message="Internal server error. More details available at the server logs.";_incidentId="9720d5b8-7689-4954-a2e0-85626a0be975";_interalError="ArrayIndexOutOfBoundsException";_requestUrl="http://localhost:8080/api/v1/admin/datasetfield/load";_requestMethod="POST"|]]
[2023-09-26T13:29:26.968-0500] [Payara 5.2022.3] [SEVERE] [] [edu.harvard.iq.dataverse.api.errorhandlers.ThrowableHandler] [tid: _ThreadID=75 _ThreadName=http-thread-pool::http-listener-1(3)] [timeMillis: 1695752966968] [levelValue: 1000] [[
java.lang.ArrayIndexOutOfBoundsException
]]
[2023-09-26T13:29:26.968-0500] [Payara 5.2022.3] [SEVERE] [] [] [tid: _ThreadID=75 _ThreadName=http-thread-pool::http-listener-1(3)] [timeMillis: 1695752966968] [levelValue: 1000] [[
java.lang.ArrayIndexOutOfBoundsException
]]
Yes, exactly. :big_smile:
It's ok. We'll fix it. But I'm a little busy right now. :sweat_smile:
Is there an error in the tvs file?
Yes, there must be.
We don't have a validator for tsv files. We should.
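For anyone reading along, a rough sketch of what a standalone validator could check (blank lines, data rows wider than their header, duplicate names) might look like this. This is hypothetical code, not part of Dataverse; the column layout (name in the second column of #metadataBlock and #datasetField rows) follows the metadata customization guide:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class TsvLint {
    // Returns a list of human-readable problems found in metadata block TSV content.
    public static List<String> lint(String tsv) {
        List<String> problems = new ArrayList<>();
        Set<String> names = new HashSet<>();
        String section = "";
        int expectedColumns = -1;
        int lineNumber = 0;
        for (String line : tsv.split("\n", -1)) {
            lineNumber++;
            if (line.isBlank()) {
                problems.add("line " + lineNumber + ": blank line");
                continue;
            }
            String[] values = line.split("\t", -1); // limit -1 keeps trailing empty columns
            if (values[0].startsWith("#")) {        // header row: #metadataBlock, #datasetField, ...
                section = values[0];
                expectedColumns = values.length;
                continue;
            }
            if (values.length > expectedColumns) {
                problems.add("line " + lineNumber + ": more columns than the header row");
            }
            // the block name and every field name (second column) must be unique
            if (("#metadataBlock".equals(section) || "#datasetField".equals(section))
                    && values.length > 1 && !values[1].isBlank() && !names.add(values[1])) {
                problems.add("line " + lineNumber + ": duplicate name '" + values[1] + "'");
            }
        }
        return problems;
    }
}
```

Running it over a TSV with a blank line and a block name reused as a field name would report both problems, which covers the two failure modes discussed in this thread.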
I forget, are you comfortable adding some logger.log lines to DatasetFieldServiceApi and recompiling the code? Because that's what we'll probably need to do to debug the tsv file.
yes, I'm comfortable
How can I do it?
Can you please try adding this?
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
index 00b7dfa6e3..f3421fb626 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
@@ -243,8 +243,10 @@ public class DatasetFieldServiceApi extends AbstractApiBean {
             try {
                 br = new BufferedReader(new FileReader("/" + file));
                 while ((line = br.readLine()) != null) {
+                    logger.log(Level.INFO, "line: " + line);
                     lineNumber++;
                     values = line.split(splitBy);
+                    logger.log(Level.INFO, "values: " + java.util.Arrays.toString(values));
                     if (values[0].startsWith("#")) { // Header row
                         switch (values[0]) {
                             case "#metadataBlock":
ok
Oh, sorry, I'm just showing you the output of git diff.
I'm hoping you can add those two logger.log lines to the code, recompile, redeploy, reload the tsv, and look at server.log.
Yes, I was confused.
Philip Durbin said:
Oh, sorry, I'm just showing you the output of git diff.
Philip Durbin said:
I'm hoping you can add those two logger.log lines to the code, recompile, redeploy, reload the tsv, and look at server.log.
What do I need to add?
Do you see those two logger.log lines? With the +?
ok ok hahahaha
I'm so tired my brain doesn't work at all hahaha.
me too :sweat_smile:
I am recompiling with "mvn clean package"
image.png
I got this error
You'll need to make a change like this:
diff --git a/pom.xml b/pom.xml
index 7ba22d2a07..353682353f 100644
--- a/pom.xml
+++ b/pom.xml
@@ -305,7 +305,7 @@
         <dependency>
             <groupId>org.dataverse</groupId>
             <artifactId>unf</artifactId>
-            <version>6.0</version>
+            <version>6.0.1</version>
         </dependency>
         <!-- Rosuda Rengine and Rserve, packaged by org.nuiton.thirdparty -->
         <!-- TODO: see if there's another, better maintained maven repository for Rosuda libraries? - L.A. -->
More background on that UNF problem: https://groups.google.com/g/dataverse-community/c/YQgMEpSX360/m/GUqvuJDzAQAJ
Philip Durbin said:
We don't have a validator for tsv files. We should.
Once upon a time I started working on gdcc/mdbtool exactly because of this.
Yes. #containers > hacking on metadata blocks
Hi, I replied in the Google group. Looks like the metadataBlock "name" is the same as one of the dataset field "name" in the block. All "names" need to be unique. Julian confirmed that with this: https://guides.dataverse.org/en/latest/admin/metadatacustomization.html#metadatablock-properties
Here's where the names are the same - linea_investigacion
Screenshot-2023-09-26-at-4.39.54-PM.png
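Sherry's point above boils down to a simple invariant: the block `name` and every field `name` in the TSV must be unique. A trivial way to check that invariant over a list of names (illustrative only, not Dataverse code):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class UniqueNames {
    // Returns the first duplicated name in the list, or null if all names are unique.
    public static String firstDuplicate(List<String> names) {
        Set<String> seen = new HashSet<>();
        for (String name : names) {
            if (!seen.add(name)) { // add() returns false if the name was already present
                return name;
            }
        }
        return null;
    }
}
```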
Oh ho! Thanks, @Sherry Lake! I obviously missed that reply of yours!
Sherry Lake said:
Hi, I replied in the Google group. Looks like the metadataBlock "name" is the same as one of the dataset field "name" in the block. All "names" need to be unique. Julian confirmed that with this: https://guides.dataverse.org/en/latest/admin/metadatacustomization.html#metadatablock-properties
Here's where the names are the same - linea_investigacion
Screenshot-2023-09-26-at-4.39.54-PM.png
Yeah, thanks Sherry. I changed it, but it's the same: it still doesn't show as a dropdown.
Is the problem the extra line breaks in the middle?
@Philip Durbin I have reinstalled Dataverse with the changes you said to add. What is the next step?
I'm trying too.
upload the tvs file and see the logs?
Generally, yes.
But I don't like that the tsv file gets into the database when there are errors. :rage:
Does the file have to be ".tsv", not ".tvs"?
Does the dropdown box display? If so, is there nothing in the box?
Another thing to try: remove the properties file (or at least rename it if you don't want to remove it). If the properties file does not exist (matching the metadata block), then the text from the database will be what is displayed.
One more thing to try: remove the contents of the "Identifier" field in the controlled vocab section.
Here is what I used to create a new metadata block with controlled vocabulary:
Screenshot-2023-09-26-at-4.52.34-PM.png
If there's anything wrong with the tsv file I would like Dataverse to reject it. However, at least part of the tsv file is getting into the database. This is a bug. :lady_beetle:
We did talk about this some place else and how a commit/rollback would help... Not sure where that was...
And didn't someone want to open an issue?
In gray are the blank lines I'm wondering about. I don't know if they cause a problem or not:
Screen-Shot-2023-09-26-at-5.02.54-PM.png
Philip Durbin said:
In gray are the blank lines I'm wondering about. I don't know if they cause a problem or not:
Yes, @Philip Durbin, that was the trouble.
I have finished reindexing Solr with the .tvs file uploaded, and that fixed the error.
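For the record, blank lines are a plausible trigger for the `ArrayIndexOutOfBoundsException` above: Java's `String.split` still returns a one-element array for an empty line, and with the default limit it also silently drops trailing empty columns, so loader code that indexes columns by header position can run off the end of the array. A quick demonstration of that `split` behavior (my reading of the failure mode, not taken from the Dataverse source):

```java
public class SplitDemo {
    public static void main(String[] args) {
        // A blank line still yields one (empty) element, not zero columns.
        String[] blank = "".split("\t");
        System.out.println(blank.length);        // 1
        System.out.println(blank[0].isEmpty());  // true

        // With the default limit, trailing empty columns are dropped...
        String[] dropped = "fieldName\t\t\t".split("\t");
        System.out.println(dropped.length);      // 1

        // ...so pass limit -1 if you need every column position preserved.
        String[] kept = "fieldName\t\t\t".split("\t", -1);
        System.out.println(kept.length);         // 4
    }
}
```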
Thank you very much @Philip Durbin @Oliver Bertuch @Sherry Lake
:tada:
Never give up! :grinning:
Great job, @Santiago Florez !
Philip Durbin has marked this topic as resolved.
This topic was moved here from #community > ✔ Custom metadata block by Philip Durbin.