Preparation¶
Here, some values are initialized that will be needed in the later steps.
Set up¶
Import the requests module and assign the base url of the Dataspace as a variable.
from pprint import pprint
import requests
base_url = "https://vision-x-api.base-x-ecosystem.org"
Fill in Values¶
Fill in the values for the variables below.
Note that unlike the AmazonS3 example, here you cannot provide the URL of your storage directly, nor can you specify your password (Account Key). Instead, the URL is a Connector-wide setting that can only be configured when creating or editing a Connector. Additionally, instead of providing a password, a SAS token is retrieved from the vault. For this you will need to manually generate a SAS token and put it into the vault as a JSON string.
An example SAS token as a JSON string to put into the vault is provided here:
{
"edctype": "dataspaceconnector:azuretoken",
"sas": "se=2025-05-20T13%3A31%3A26Z&sp=w&sv=2025-05-05&sr=c&sig=ghRxJvgnvlnN96xzc1trTj4R4cJtaH7glNnjdCXE3iQ%3D"
}
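As a sketch, the JSON string for the vault can be assembled with Python's json module. The SAS string below is a placeholder; a real token must be generated for your own storage account (for example via the Azure Portal or the Azure SDK):

```python
import json

# Placeholder SAS string -- substitute a real token generated for your
# own storage account (e.g. in the Azure Portal under "Shared access tokens").
sas = "se=2025-05-20T13%3A31%3A26Z&sp=w&sv=2025-05-05&sr=c&sig=..."

# Wrap the SAS string in the JSON structure the Connector expects.
vault_entry = json.dumps({
    "edctype": "dataspaceconnector:azuretoken",
    "sas": sas,
})

print(vault_entry)
```

The resulting string is what you store under the secret name referenced by `azure_sas` below.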
# Your JWT received from Keycloak via vision-x-auth.base-x-ecosystem.org
token = "ey..."
token_header = {"Authorization": f"Bearer {token}"}
# The name of your Connector
connector_name = "my-connector"
# Your account in the Azure Storage
azure_account = "my-account"
# The secret in the vault where your SAS token for Azure Storage is stored
azure_sas = "my-sas-secret-name"
# The container where the file should be stored in
azure_container = "my-container"
# The DSP address of the Connector providing the offer
originator = "https://vision-x-api.base-x-ecosystem.org/connectors/alice/cp/protocol"
# The ID of the Agreement for the offer
agreement_id = "some-random-uuid"
Initiate Transfer¶
Here you request the transfer of the data to your storage.
Initiates the Transfer.
url = f"{base_url}/connectors/{connector_name}/cp/management/v3/transferprocesses"
payload = {
    "@context": {
        "odrl": "http://www.w3.org/ns/odrl/2/"
    },
    "counterPartyAddress": originator,
    "contractId": agreement_id,
    "transferType": "AzureStorage-PUSH",
    "dataDestination": {
        "type": "AzureStorage",
        "container": azure_container,
        "account": azure_account,
        "keyName": azure_sas,
    },
    "protocol": "dataspace-protocol-http"
}
response = requests.post(url, json=payload, headers=token_header)
print(response.content)
response.raise_for_status()
transfer_id = response.json()["@id"]
print(f"Started Transfer with ID: {transfer_id}")
Confirms that the Transfer succeeded.
url = f"{base_url}/connectors/{connector_name}/cp/management/v3/transferprocesses/{transfer_id}"
response = requests.get(url, headers=token_header)
response.raise_for_status()
print("Transfer data:")
pprint(response.json())
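Since the transfer runs asynchronously, a single GET may still show an intermediate state. A minimal polling sketch, where `fetch_state` is a hypothetical hook standing in for the GET request above (e.g. a lambda returning the `state` field of the response JSON):

```python
import time

def wait_for_transfer(fetch_state, timeout=60.0, interval=2.0):
    """Poll fetch_state() until the transfer reaches a final state.

    fetch_state is a callable returning the current state string, e.g.
    lambda: requests.get(url, headers=token_header).json()["state"].
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = fetch_state()
        if state in ("COMPLETED", "TERMINATED"):
            return state
        time.sleep(interval)
    raise TimeoutError("transfer did not reach a final state in time")
```

This is only a sketch; the exact state names and response shape should be checked against your Connector's management API.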
If everything was successful, the file should now be in your storage.