Stream: containers

Topic: how to configure robots.txt?


Ellen K (May 28 2024 at 12:20):

Hi, I'm running a test of inserting JSON-LD into the header of the SPA, and I'm running into a problem because the robots.txt defined in the Dataverse install disallows all access by default. Is there a way to configure the building of the Dataverse image so that it uses a customized robots.txt? @Oliver Bertuch maybe something in configbaker?
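(A quick illustration of the problem described above: a "disallow all" robots.txt keeps crawlers away from every page, so any JSON-LD in the page header never gets indexed. This sketch uses Python's standard `urllib.robotparser`; the robots.txt body and the example URL are illustrative, not copied from Dataverse.)

```python
# Demonstrate how a "disallow all" robots.txt blocks crawler access.
from urllib.robotparser import RobotFileParser

# Illustrative rules; assumed to resemble the restrictive default.
DISALLOW_ALL = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(DISALLOW_ALL.splitlines())

# Any crawler (e.g. Googlebot) is blocked from every path, including a
# hypothetical dataset landing page carrying the JSON-LD markup.
print(parser.can_fetch("Googlebot", "/dataset.xhtml?persistentId=doi:10.123/ABC"))  # → False
```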

Philip Durbin 🚀 (May 28 2024 at 13:05):

I'm not aware of a way to do it today. For non-Docker installations we document how to override the robots.txt that comes in the war file.
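(Until there is first-class support, one common container workaround is to bind-mount a custom robots.txt over the copy shipped inside the image. A sketch of a Compose override, assuming a service named `dataverse`; the path inside the image is an assumption and would need to be verified against the actual deployment layout:)

```yaml
# docker-compose override sketch (in-container path is an assumption):
services:
  dataverse:
    volumes:
      # Mount a local, customized robots.txt over the one from the war file.
      - ./robots.txt:/opt/payara/appserver/glassfish/domains/domain1/applications/dataverse/robots.txt:ro
```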

@Ellen K please feel free to create an issue in the main repo.

Ellen K (May 28 2024 at 13:58):

Ok, I created an issue here: https://github.com/IQSS/dataverse/issues/10593

Philip Durbin 🚀 (May 28 2024 at 15:25):

Thanks, I dragged it to the top of https://github.com/orgs/IQSS/projects/34/views/17 and added it to the next agenda at https://ct.gdcc.io

Philip Durbin 🚀 (May 28 2024 at 15:26):

@Ellen K this is for your local development, right? https://beta.dataverse.org doesn't use containers.

Ellen K (May 28 2024 at 16:46):

Philip Durbin said:

Ellen K this is for your local development, right? https://beta.dataverse.org doesn't use containers.

ok, yes it is for beta.dataverse.org

Ellen K (May 28 2024 at 16:48):

So maybe we can modify the deployment
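(For a deployment like beta.dataverse.org, the customized file could simply relax the default. An illustrative robots.txt; the specific crawl rules here are an example, not a project recommendation:)

```text
User-agent: *
Allow: /
# Keep crawlers out of dynamic search result pages while letting
# landing pages (with their JSON-LD) be indexed.
Disallow: /dataverse/*?q
```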


Last updated: Oct 30 2025 at 05:14 UTC