I noticed that Harvard Dataverse limits researchers to 1 TB of storage. I'm wondering how that is enforced. I haven't found anything in the documentation yet that covers limiting dataset size.
I don't think we have a great way of enforcing it. However, now that Dataverse 6.1 has collection-level quotas, we plan to roll them out to Harvard Dataverse: https://github.com/IQSS/dataverse.harvard.edu/issues/240
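For reference, here's a rough sketch of how you'd work with those collection quotas from a script (Python with requests). The SERVER_URL, API_TOKEN, and collection alias are placeholders, and the exact endpoint paths and HTTP verbs are my best reading of the 6.1 Native API guide, so verify against the guide for your version:
```python
import requests

# Assumptions: SERVER_URL, API_TOKEN, and COLLECTION are placeholders for your
# installation. The quota endpoints below are superuser-only and follow my
# recollection of the Dataverse 6.1 Native API guide -- double-check the verb
# for setting a quota (it may be POST rather than PUT) before relying on this.
SERVER_URL = "https://demo.dataverse.org"
API_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
COLLECTION = "myCollection"
headers = {"X-Dataverse-key": API_TOKEN}

# Check how much data the collection currently holds.
r = requests.get(f"{SERVER_URL}/api/dataverses/{COLLECTION}/storage/size",
                 headers=headers)
print(r.json())

# Check whether a quota is defined on the collection.
r = requests.get(f"{SERVER_URL}/api/dataverses/{COLLECTION}/storage/quota",
                 headers=headers)
print(r.json())

# Set a 1 TB quota (the size is given in bytes).
one_tb = 1_000_000_000_000
r = requests.put(f"{SERVER_URL}/api/dataverses/{COLLECTION}/storage/quota/{one_tb}",
                 headers=headers)
print(r.json())

# Remove the quota again.
r = requests.delete(f"{SERVER_URL}/api/dataverses/{COLLECTION}/storage/quota",
                    headers=headers)
print(r.json())
```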
Thank you. I still have to get UCLA up to 6.1.
So does Harvard. Hopefully upgrading soon. :grinning:
Harvard has the 1 TB limit now, so is it enforced manually? Looking at what's coming and dealing with it then?
Yeah, manually.
Just curious how many production installations have implemented collection storage quotas?
we're starting to look at this at Berkeley, as we're working on a pilot to host faculty datasets on our Dataverse instance. I can see we don't have a way to define a quota per user, but we could define one per collection.
does anyone have insight into how we could run regular reports about usage per user?
I believe this is where we're tracking that feature request:
Extend Storage Quotas to individual datasets and user accounts #11529
Please feel free to :thumbs_up: and leave a comment!
thanks @Philip Durbin 🚀 - I'm also interested to know if there are existing known ways for us to run reports to get a sense of usage on a per-user basis.
@maría a. matienzo you should probably ask at https://groups.google.com/g/dataverse-community
I took a look at https://guides.dataverse.org/en/6.7.1/admin/reporting-tools-and-queries.html but I don't see anything.
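fwiw, until something official lands, one stopgap is querying the database directly, in the spirit of the queries on that page. A minimal sketch, assuming the standard Dataverse PostgreSQL schema (datafile rows share their id with dvobject, and dvobject.creator_id points at authenticateduser) and placeholder connection details:
```python
import psycopg2

# Assumption: connection parameters are placeholders for your installation's
# PostgreSQL database; table and column names (datafile.filesize,
# dvobject.creator_id, authenticateduser.useridentifier) follow the standard
# Dataverse schema.
conn = psycopg2.connect(host="localhost", dbname="dvndb",
                        user="dvnapp", password="secret")

# Sum stored file sizes per depositing user. The dvobject "creator" is
# whoever deposited the file, which serves as a proxy for per-user usage.
QUERY = """
SELECT au.useridentifier,
       COUNT(df.id)                              AS file_count,
       pg_size_pretty(SUM(df.filesize)::bigint)  AS total_size
FROM datafile df
JOIN dvobject dvo ON dvo.id = df.id
JOIN authenticateduser au ON au.id = dvo.creator_id
GROUP BY au.useridentifier
ORDER BY SUM(df.filesize) DESC;
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for useridentifier, file_count, total_size in cur.fetchall():
        print(f"{useridentifier}: {file_count} files, {total_size}")
conn.close()
```
Attributing files to their dvobject creator is only an approximation of "usage per user", but it's a starting point until the feature request above lands.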
https://groups.google.com/g/dataverse-community/c/MsOaWFq96wk/m/cU2N5RVDAQAJ looks great. Thanks.