How do I delete a LARGE dataset (over 11,000 files)?
It is still in draft and I used this command:
curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE "$SERVER_URL/api/datasets/$ID"
and got this timeout message:
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>503 Service Unavailable</title>
</head><body>
<h1>Service Unavailable</h1>
<p>The server is temporarily unable to service your
request due to maintenance downtime or capacity
problems. Please try again later.</p>
</body></html>
Is there a way to remove it directly in the database? But then there are still the 11,000+ files to remove.
Advice???
First, this sounds like a bug, so please feel free to create an issue.
What if you delete some of the files first? Maybe that would help?
I was finally able to see the dataset in the UI; it was timing out on me just trying to display it.
Oh, I was off by a few thousand... the dataset now has 14,838 files.
It is very slow... and still timing out... but I'll try to delete some files first.
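As an aside, assuming the native file-listing endpoint is available on this Dataverse installation, the draft's file count can be checked without waiting for the UI:
curl -s -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/datasets/$ID/versions/:draft/files" | jq '.data | length'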
Sounds good. Perhaps deleting the files via the API will be more reliable.
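For reference, a rough sketch of deleting the files one at a time (this assumes the native endpoints GET /api/datasets/$ID/versions/:draft/files and DELETE /api/files/$FILE_ID are available on this Dataverse version, and that jq is installed; $FILE_ID is just the loop variable):
curl -s -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/datasets/$ID/versions/:draft/files" \
  | jq -r '.data[].dataFile.id' \
  | while read -r FILE_ID; do
      # delete one file at a time and pause briefly so the server can keep up
      curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE "$SERVER_URL/api/files/$FILE_ID"
      sleep 1
    done
Once the file count is low enough, the original DELETE on the dataset itself should hopefully go through.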
I see you posted to the Google group as well. Good idea. :sweat_smile:
Even using the API to delete a single file stalls the system.
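If it helps, depending on the Dataverse version a single-file delete can also go through the SWORD API instead of the native endpoint sketched above; something along these lines (the SWORD path and using the file's database id as $FILE_ID are assumptions about this installation):
curl -u "$API_TOKEN:" -X DELETE "$SERVER_URL/dvn/api/data-deposit/v1.1/swordv2/edit-media/file/$FILE_ID"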
Whoa!
That's definitely a bug.
See also #troubleshooting > Limit number of files per dataset