Unable to import datasources with encrypted_extra params via REST #31438
Comments
Hey @withnale! I'm here to help you with your Superset issue. I'm a bot designed to assist with bugs, questions, and contributions. Let's tackle this together! A similar issue was previously reported and closed. That issue was about importing a Snowflake database with the encrypted_extra field. In your case, the failure when importing a BigQuery datasource with encrypted_extra appears to come from the same place: the REST import schema does not accept the field. To continue talking to Dosu, mention @dosu.
To comment on the bot notes above: I've seen the ...
@fisjac @betodealmeida, wondering if you might have some ideas about this.
Ah, looks like we need to update the schema. I think the reasoning is that when we export the database we strip passwords and encrypted extra, so the import doesn't expect them. But people can add them manually, and the import should accept and use them if present. I'll fix it; it should be quick.
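For illustration, here's a minimal sketch of the kind of schema change being described, assuming ImportV1DatabaseSchema is a Marshmallow schema (the class itself and the other fields shown here are illustrative, not the actual Superset code):

```python
from marshmallow import Schema, fields


class ImportV1DatabaseSchema(Schema):
    # Illustrative subset of the fields an import schema might declare.
    database_name = fields.String(required=True)
    sqlalchemy_uri = fields.String(required=True)
    # Accept an optional encrypted_extra blob so manually added credentials
    # (e.g. a BigQuery service-account key) pass validation on import.
    encrypted_extra = fields.String(required=False, allow_none=True)
```

Keeping the field optional would leave existing exports (which strip encrypted_extra) valid, while letting hand-edited bundles pass validation.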
Does it make sense to ensure the same validations take place on the CLI and REST invocations? It seems strange that an object with encrypted_extra can sneak through the schema validation on the CLI version.
Hello. Did this ever get addressed?
I think it's OK to have different validations for CLI and REST, since using the CLI implies direct access to the database. Because of this, allowing more powerful operations than the ones supported by the API makes sense, IMO. That said, I think the API should accept the encrypted extra, since it's no different from sending credentials when you create a DB.
Bug description
At present it is possible to create a BigQuery datasource which uses an explicit key by creating an import_datasources.yaml file that contains something similar to the block below. This can be imported using superset import_datasources -p filename.yaml from a running instance.

However, if I wish to import the same data structure using the REST API /api/v1/database/import, it will fail. It seems that the REST version applies additional schema checks on the bundled zipfile and fails because encrypted_extra is not part of ImportV1DatabaseSchema.
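For reference, a rough way to reproduce the REST failure. This is a hedged sketch, not taken from the report: the bundle layout, the YAML contents, the form-field names ("formData", "overwrite"), and the base URL/token are assumptions about the usual Superset import API and may differ between versions.

```python
import io
import zipfile

import requests

BASE = "http://localhost:8088"  # assumption: a local Superset instance
TOKEN = "..."  # assumption: a JWT obtained via /api/v1/security/login

# Build a minimal import bundle in memory. The database YAML carries an
# encrypted_extra block (e.g. a BigQuery service-account key), which the CLI
# import accepts but the REST schema reportedly rejects.
database_yaml = """\
database_name: my_bigquery  # illustrative
sqlalchemy_uri: bigquery://my-project
encrypted_extra: '{"credentials_info": {"type": "service_account"}}'
version: 1.0.0
"""
metadata_yaml = """\
version: 1.0.0
type: Database
timestamp: '2024-01-01T00:00:00+00:00'
"""

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as bundle:
    bundle.writestr("database_export/metadata.yaml", metadata_yaml)
    bundle.writestr("database_export/databases/my_bigquery.yaml", database_yaml)
buf.seek(0)

# A CSRF token may also be required depending on configuration.
resp = requests.post(
    f"{BASE}/api/v1/database/import/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    files={"formData": ("databases.zip", buf, "application/zip")},
    data={"overwrite": "true"},
)
# Per the report above, this fails schema validation because encrypted_extra
# is not part of ImportV1DatabaseSchema.
print(resp.status_code, resp.text)
```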
Screenshots/recordings
No response
Superset version
master / latest-dev
Python version
3.11
Node version
16
Browser
Chrome
Additional context
No response
Checklist