Opendatasoft is now Huwise! 🎉
We often want to do a metadata-only update via the Back Office UI. For example, we want to update a date field, press save, and then publish. When we do this, ODS for some reason reprocesses and refreshes the dataset, which at times can take over 30 minutes to process and publish (and may then fail due to a timeout).
Hi all,

We currently have a document library on our website, but we have had feedback from customers that it's not easy to navigate. As a result, we are exploring the possibility of using our Open Data Portal as the home of a new "Technical Library" containing 200+ PDFs and other supporting documents. However, it seems that after about 80 additional attachments I can't add any more.

As a DNO, we are very cautious about data security, so ideally I need a solution where the documents can be stored on the portal itself; options like GitHub or SharePoint aren't really viable, as getting permission would likely be a dead end.

Has anyone got ideas or examples of similar resources? There is potential to split the attachments over multiple datasets, but that becomes a bit of a nightmare to maintain and police.

Thanks,
Ryan
After 14 years, our brand is evolving! New name, new colors, new identity… but the same solution and the same mission: helping you turn your data into a source of knowledge, innovation, and performance.

A new name that reflects our vision
Huwise = Human + Wise: enhancing human thinking through data.

New identity, same platform
Don't worry - the platform works exactly the same as before. Only a few visual elements have changed to reflect our new identity: the logo, the name in the portal footer, login screens, etc.

Learn more
Watch the video from our co-founders and check out our FAQ to get all the details about the change!
Since there is still no button to hide a public dataset from the catalog, we're trying a workaround. We would like to embed a public dataset from our secondary workspace, which isn't publicly promoted, into a dashboard/code-editor page on our primary workspace. Has anyone tried something similar? Did it work?

My first thought was to use the "domain" attribute of the "ods-dataset-context" widget, but that didn't work:

<ods-dataset-context context="test" test-dataset="test-index-dataset" test-domain="https://myseconddomain.opendatasoft.com/"></ods-dataset-context>

Thanks for your ideas :)
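One thing worth checking first: the widget's domain attribute usually expects a bare hostname or domain ID rather than a full URL with scheme and trailing slash. A quick way to confirm the secondary dataset is reachable at all is to query it directly over the Explore API. This is a minimal sketch, assuming the hostname and dataset ID from the post above; it only builds the URL and does not send the request:

```python
from urllib.parse import urlencode

def records_url(domain: str, dataset_id: str, limit: int = 20) -> str:
    """Build an Explore API v2.1 records URL for a dataset on any workspace.

    `domain` is the bare portal hostname (no https:// prefix, no trailing
    slash) -- the same form the widget's domain attribute typically expects.
    """
    base = f"https://{domain}/api/explore/v2.1/catalog/datasets/{dataset_id}/records"
    return f"{base}?{urlencode({'limit': limit})}"

url = records_url("myseconddomain.opendatasoft.com", "test-index-dataset", limit=5)
print(url)
```

If that URL returns records in the browser, the dataset is accessible cross-workspace and the widget attribute syntax is the likely culprit; try `test-domain="myseconddomain.opendatasoft.com"` without the scheme.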
When we create a dataset from our FTP server, we currently use SFTP (documentation: https://userguide.opendatasoft.com/l/en/article/jpp5wvvkkf-creating-a-dataset-from-a-remote-source-url-api-ftp). We would now like to set up an "FTP with meta CSV harvester". However, this harvester only supports FTP or FTPS (documentation: https://userguide.opendatasoft.com/l/en/article/wsyubsjp1m-ftp-with-meta-csv-harvester). It would be great if we could also use SFTP for FTP harvesters to keep our setup consistent and clean.
Hi, I'm Kamal Hinduja from Switzerland. Can anyone explain in detail whether there are limits on API call rates or data retrieval sizes? Thanks
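API quotas are usually easiest to inspect from the response headers of your own portal. A minimal sketch, assuming the common `X-RateLimit-*` header convention (check your portal's actual responses to confirm the exact header names):

```python
def quota_status(headers: dict) -> dict:
    """Summarize rate-limit headers from an API response.

    Header names (X-RateLimit-Limit / X-RateLimit-Remaining) are an
    assumption based on common API conventions, not a guarantee.
    """
    limit = int(headers.get("X-RateLimit-Limit", 0))
    remaining = int(headers.get("X-RateLimit-Remaining", 0))
    return {"limit": limit, "remaining": remaining, "used": limit - remaining}

# Hypothetical header values, for illustration only:
print(quota_status({"X-RateLimit-Limit": "10000", "X-RateLimit-Remaining": "9990"}))
```

Actual per-day quotas depend on your plan and on whether the call is authenticated, so the definitive numbers come from your contract and the portal's API documentation.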
I am kindly requesting air quality datasets with variables (PM2.5, PM10, O₃, NO₂, SO₂, etc.) across many locations in Africa.
In a few days, you'll be able to perform joins on very large datasets with no size limit. 🎉 👉 To help you prepare for this change, you’ll receive an email with the list of affected datasets and the actions to take. Stay tuned!
What is the Automation API?
The Automation API streamlines the back-office management of an Opendatasoft portal by automating over 150 actions - from data publication to domain and user permissions management - without using the interface.

New endpoints to manage your metadata translations
Automatically set translations for your metadata with the new Automation API endpoints. No need to re-enter them manually if they already exist in your information system tools or within the portal. The result: significant time savings and simplified multilingual management.

Try it now
If your plan includes the Automation API, you can now explore the updated technical documentation to try it out:
Translation suggestions
Code editor pages translations
Studio pages translations
Dataset translations

Learn more
Discover real-world use cases and the benefits of the Automation API in this dedicated article, or contact your Customer Success Manager or sales representative.
Is there a way to add metadata and dataset schema translations (e.g. in English or Italian) via the Automation API?
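According to the announcement above, dataset translation endpoints are part of the Automation API. A rough sketch of what such a call could look like, assuming a hypothetical endpoint path and dataset UID - the exact route, payload shape, and auth scheme must be checked against the Automation API reference for your portal (the request is built but not sent):

```python
import json
from urllib import request

DOMAIN = "yourportal.opendatasoft.com"   # assumption: your portal hostname
API_KEY = "YOUR_API_KEY"                 # an API key with back-office rights

# Assumed endpoint shape and placeholder dataset UID -- verify both against
# the Automation API documentation before use.
url = f"https://{DOMAIN}/api/automation/v1.0/datasets/da_xxxxxx/translations/it/"
payload = {"metadata": {"default": {"title": {"value": "Titolo del dataset"}}}}

req = request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Authorization": f"apikey {API_KEY}",
             "Content-Type": "application/json"},
    method="PUT",
)
# response = request.urlopen(req)  # uncomment to actually send the request
```

The same pattern would apply to schema (field label/description) translations, which the announcement lists under "Dataset translations".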
Hi all, are there definitions anywhere for the possible "Actions" in the portal API monitoring? I.e., what defines a "download", etc.? Thanks, Ryan
Hello everybody, I have two maps in my dashboard (two different years), and I would like them to share a "location": for example, if I zoom on the left one, I want the right one to zoom as well. Is that possible? Thanks in advance
Hi everyone, I'm Dario Schiraldi, currently working on a project where I need to join multiple datasets, and I'd love to hear your suggestions on best practices for performing data joins. Specifically, I'm interested in methods for both SQL and Python. I'd appreciate any insights, tips, or resources you have! Thanks in advance! Regards, Dario Schiraldi, CEO of Travel Works
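For the Python side, the standard tool is pandas' merge, which mirrors SQL join semantics. A minimal sketch with hypothetical example tables, showing a left join and its SQL equivalent:

```python
import pandas as pd

# Two small example tables (hypothetical data, for illustration only).
orders = pd.DataFrame({"order_id": [1, 2, 3], "customer_id": [10, 20, 10]})
customers = pd.DataFrame({"customer_id": [10, 20], "name": ["Ada", "Grace"]})

# Left join: keep every order, attach the matching customer name.
# SQL equivalent:
#   SELECT o.order_id, o.customer_id, c.name
#   FROM orders o LEFT JOIN customers c ON o.customer_id = c.customer_id;
joined = orders.merge(customers, on="customer_id", how="left")
print(joined)
```

The `how` parameter also accepts "inner", "right", and "outer"; a common tip is to validate join cardinality (e.g. `validate="many_to_one"`) so an unexpected duplicate key fails loudly instead of silently multiplying rows.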
Does anyone have tips on how to shape data in our portals so it works better with the AI tools? The concern from our leadership is that someone will ask a question and, if the data contains too many filterable items, it could return an incorrect result. Are there any guidelines on how to shape the data to make it easier for AI to understand and return correct results?
Are there plans to add more mapping capability to Studio maps - for example clusters, dots and shapes, and heat maps - i.e., the same functionality shown in the Map Builder?
As I'm trying to create a page with the code editor, I'm facing an issue with dataset refinement. To automate a line chart for different contexts, so that I can select one dataset from a list of many and reuse the same ods-chart tag, I want to refine all the contexts before referencing them in the chart widget. The important part of my question is this: for this refinement, it would be more convenient for me to define the values that should be excluded, rather than the ones that should be included. Is there a way to do so? Thank you all in advance! :)
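One avenue worth checking: the Opendatasoft Search API (v1), which the classic widgets sit on, accepts `exclude.<field>=<value>` parameters alongside `refine.<field>`, and the same keys can typically be passed through a context's *-parameters attribute. A minimal sketch building such a query string - the field names here are hypothetical:

```python
from urllib.parse import urlencode

def context_params(excluded: dict) -> str:
    """Build query parameters that exclude facet values instead of refining.

    Emits one `exclude.<field>=<value>` pair per entry; the same keys can
    be tried in a widget context's *-parameters attribute.
    """
    params = {f"exclude.{field}": value for field, value in excluded.items()}
    return urlencode(params)

qs = context_params({"year": "2023", "region": "North"})
print(qs)  # e.g. exclude.year=2023&exclude.region=North
```

In widget markup that would look something like `mycontext-parameters="{'exclude.year': '2023'}"` - worth verifying against the ods-widgets documentation for your portal version.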