I have a feeling this will need to go to Jim (and I will ask this in the Google group too, for him to see) - I'm getting an error when using DVUploader.
First file is the dvuploader command and ensuing error message:
dvuploader_cmd_error.txt
Second file is the section for this job in the server logs:
error_log.txt
So......
What happened?
FYI:
MaxFileUploadSizeInBytes = 107374182400
This file = 15616921658 bytes
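(For reference, the limit above came from the admin settings API - just a minimal check, assuming the usual localhost-only admin endpoint on the server; the file name below is a placeholder:)

```bash
# On the Dataverse server itself (the admin API is normally blocked externally):
curl http://localhost:8080/api/admin/settings/:MaxFileUploadSizeInBytes

# Size of the file we were trying to upload:
stat -c %s our-big-file.zip   # placeholder name, not the real file
```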
Where are the files? On S3?
Should we delete them and start over?
Or, since they are on S3, how can we get the dataset to know about them?
Yes, thanks for posting at https://groups.google.com/g/dataverse-community/c/J7zdhcAC4bs/m/36_tPhM2AgAJ as well.
For my part, I'm looking at the line that's throwing the error: https://github.com/IQSS/dataverse/blob/v6.2/src/main/java/edu/harvard/iq/dataverse/datasetutility/AddReplaceFileHelper.java#L2148
at edu.harvard.iq.dataverse.datasetutility.AddReplaceFileHelper.addFiles(AddReplaceFileHelper.java:2148)
When we see errors like this...
"jakarta.ejb.EJBException: One or more Bean Validation constraints were violated while executing Automatic Bean Validation on callback event: prePersist for class: edu.harvard.iq.dataverse.DatasetFieldValue. Please refer to the embedded constraint violations for details."
... it often means that values were saved in the database and then later the rules were made more restrictive.
But I'm not seeing anything at https://dataverse.lib.virginia.edu/dataset.xhtml?persistentId=doi:10.18130/V3/ESQBIY that's suspicious.
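One way to double-check the stored values outside the web UI is to dump the draft version over the native API - just a sketch, assuming an API token that can see the draft:

```bash
export SERVER_URL=https://dataverse.lib.virginia.edu
export PERSISTENT_ID=doi:10.18130/V3/ESQBIY

# Dump the latest (draft) version's metadata and eyeball the field values
# that might be tripping the DatasetFieldValue validation on prePersist.
curl -s -H "X-Dataverse-key: $API_TOKEN" \
  "$SERVER_URL/api/datasets/:persistentId/versions/:latest?persistentId=$PERSISTENT_ID" \
  | jq '.data.metadataBlocks'
```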
Do you think you'd be able to easily replicate this on https://demo.dataverse.org ?
This dataset had the "bounding box" metadata which, after upgrading to 6.?, became "incomplete metadata". So the info in that metadata field was removed - in the latest version.
But I am not sure if that metadata field was "fixed" and the version saved BEFORE the files were uploaded.
Philip Durbin said:
Do you think you'd be able to easily replicate this on https://demo.dataverse.org ?
What is the MaxFileUploadSizeInBytes limit on demo.dataverse.org?
And demo.dataverse.org doesn't use S3 direct upload, does it?
So would trying to replicate it there even work?
I see. Hmm. I'm wondering if it would be best to republish just the fix for the bounding box.
And then, once we clear that hurdle, try to upload the files.
Also, if this is definitely the bounding box, I'd say it's a bug that Dataverse produces such a cryptic error.
Maybe the problem wouldn't have occurred if we had fixed the bounding box first, but the 70+ GB of files were already in our S3 bucket (we checked, per Jim's advice):
https://groups.google.com/g/dataverse-community/c/J7zdhcAC4bs
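(The check itself was basically just listing the dataset's DOI path in the bucket - a rough sketch, with a placeholder bucket name:)

```bash
# List objects under the dataset's DOI path in the bucket; the bucket name
# here is a placeholder, not our real one.
aws s3 ls --human-readable --summarize s3://our-dataverse-bucket/10.18130/V3/ESQBIY/
```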
Luckily, the fix was to run the API "add file" command using the instructions here:
https://guides.dataverse.org/en/latest/developers/s3-direct-upload-api.html#adding-the-uploaded-file-to-the-dataset
And since we had the DVUploader output, we had all the parameters for the JSON file.
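For anyone who hits the same thing, the call looked roughly like this (the values below are stand-ins; in our case the storageIdentifier, fileName, mimeType, and checksum came straight from the DVUploader output):

```bash
export SERVER_URL=https://dataverse.lib.virginia.edu
export PERSISTENT_ID=doi:10.18130/V3/ESQBIY

# jsonData describes the object that is already sitting in S3;
# every value here is a placeholder, not one of our real files.
export JSON_DATA='{
  "description": "Large data file",
  "storageIdentifier": "s3://our-dataverse-bucket:18xxxxxxxxx-xxxxxxxxxxxx",
  "fileName": "bigfile.dat",
  "mimeType": "application/octet-stream",
  "checksum": {"@type": "MD5", "@value": "1234567890abcdef1234567890abcdef"}
}'

# Register the already-uploaded S3 object as a file in the dataset's draft.
curl -X POST -H "X-Dataverse-key: $API_TOKEN" \
  "$SERVER_URL/api/datasets/:persistentId/add?persistentId=$PERSISTENT_ID" \
  -F "jsonData=$JSON_DATA"
```

If it works, the file shows up in the draft version right away.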
All files accounted for:
https://doi.org/10.18130/V3/ESQBIY
I'm glad you fixed it!
Sherry Lake has marked this topic as resolved.