Data Cloud Development - Chapter 2: Accessing Data Cloud Data via API (Postman)

Article Moved

This article has been moved to Medium.com. I have left the original article here, but any updates or comments will be done only on the Medium site. Thanks for your understanding and support.


Original Article

TL;DR: This article describes how to get data from Data Cloud via APIs using Postman. The steps involved include creating a connected app in the Data Cloud org and using those credentials in Postman to query the data. This is the second chapter of a series of articles for developers working with Data Cloud.

This is the second chapter of a series of articles for developers who want to work with Data Cloud. Check out the previous articles here:

Today we are using Postman to help us make the callouts to the Data Cloud org, but obviously you could use any other tool as needed, like Node.js, Python, Java, PHP, or any other language of your choice. Most likely you will use Postman during the POC phase of your project and for troubleshooting when things are not working correctly.

Before we start geeking out, let's set some ground rules:

  • Data Cloud Limits: Please review the API limits for Profile, Query, and Calculated Insights, which are in addition to the Data Cloud Limits and Guidelines.
  • Push vs. Pull: This article shows how to pull data from Data Cloud, which is subject to the limits mentioned above. Please consider pushing data instead with the help of Activations and Data Actions. There may also be an easier way to push data by using Flows to publish Platform Events.

I'm glad you are still here... that probably means that you are still interested in connecting to Data Cloud using an API, and although we are only going to be querying data using the APIs, there is so much more that can be done with the Data Cloud APIs (Metadata API, Calculated Insights API, Query API, Ingestion API, Profile API, ...). There are very good samples of those requests in the Salesforce Postman collection for Data Cloud.

Let's review the four steps that we need to take in order to query data from Data Cloud:

  • Create Connected App: In the Data Cloud org, you need to create a Connected App that will allow using OAuth to log into the org.
  • Obtain Data Cloud Org Access Token: Once we have the Connected App, we can use an OAuth flow to authorize Postman (or any API client) to access the org's resources.
  • Obtain Data Cloud Server Access Token: Once you have authorized Postman and obtained the Data Cloud Org access token, you will need to swap it for a Data Cloud Server access token.
  • Query Data Cloud: Once we have the Data Cloud Server access token, we can use it to query data from the Data Cloud org.

Create Connected App

The first step we need to take is to create a Connected App that will provide the Consumer Key and Consumer Secret needed for the OAuth flows. For that, go to the Data Cloud org's Setup menu, find App Manager, and create a new connected app with these characteristics:

Callback URL

Scopes required:

  • Manage user data via APIs (API)
  • Perform requests at any time (refresh_token, offline_access)
  • Perform ANSI SQL queries on Data Cloud data (cdp_query_api)
  • Access all Data Cloud API resources (cdp_api)

Additional scopes (optional):

  • Full access (full)
  • Manage Data Cloud Ingestion API data (cdp_ingest_api)
  • Manage Data Cloud profile data (cdp_profile_api)

You can add more settings to the Connected App to make it more secure on this page or manage additional settings on the Manage page. But since this is a demo, we'll stop right there :-)

Once the Connected App is created, you need those special values (Consumer Key, Consumer Secret) which can be obtained by clicking the Manage Consumer Details button and entering a Verification Code sent via email. Write those down in a secure location.

Configuring the Connected App
WARNING: this is only for testing! For the simplicity of this article, we are going to start by using the very unsafe OAuth 2.0 Username-Password Flow. Please DO NOT USE THIS IN PRODUCTION.

Once we prove it works, we will remove the ability to use this flow and we will use one of the more secure OAuth flows.

So, to get that flow working, we need to enable it by going to Setup > Identity > OAuth and OpenID Connect Settings and turning on the Allow OAuth Username-Password Flows toggle button. DO NOT forget to toggle this back later!

Obtain Data Cloud Org Access Token

Now, it's time to go to Postman and obtain the Data Cloud Org access token using the OAuth 2.0 Username-Password Flow (did I mention this is not safe?).

Navigate to https://www.postman.com/, log in, and open a workspace (or create a new one if needed). Once there, create a new collection. In this collection, create a new request with these characteristics:

HTTP Verb:

  • POST

URL:

Body:

  • Type: x-www-form-urlencoded
  • The body should have these five keys:

grant_type: password
client_id: {{consumerKey}}
client_secret: {{consumerSecret}}
username: {{username}}
password: {{password}}        

The request/response looks like this...

Request / Response for getting Data Cloud Org access token
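If you want to reproduce the same call outside Postman, here is a minimal Node.js sketch. It assumes Node 18+ (built-in fetch, run as an ES module so top-level await works) and the standard https://login.salesforce.com/services/oauth2/token endpoint; the environment variable names are placeholders of mine, not part of the article.

// Minimal sketch: OAuth 2.0 Username-Password flow against the Data Cloud org.
// Assumes Node 18+ (built-in fetch) and an ES module so top-level await works.
const params = new URLSearchParams({
  grant_type: "password",
  client_id: process.env.CONSUMER_KEY,        // Connected App Consumer Key
  client_secret: process.env.CONSUMER_SECRET, // Connected App Consumer Secret
  username: process.env.SF_USERNAME,
  password: process.env.SF_PASSWORD,
});

const res = await fetch("https://login.salesforce.com/services/oauth2/token", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: params,
});

const { access_token, instance_url } = await res.json();
console.log(instance_url); // the org's My Domain URL, used in the next step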

Keep the values retrieved handy, because we will use them for the next step. A nice, easy way of doing that is having Postman save them in variables for you. You may have seen in the image above that I have created some variables to store the Consumer Key, Consumer Secret, Username, and Password.

We can write this test script to have Postman save those values for you. Click on the Tests tab and paste this code:

pm.test("response is ok", function () {
    // Assert status is 200
    pm.response.to.have.status(200);
    
    const context = pm.environment.name ? pm.environment : pm.collectionVariables;
    const body = pm.response.json();
    context.set("_orgInstanceUrl", body.instance_url);
    context.set("_orgAccessToken", body.access_token);
})        

Obtain Data Cloud Server Access Token

We could use the access token and instance URL to talk to Salesforce like we have done many times in the past, but the server that runs Data Cloud is different, so we need to log in to that server.

One easy way of doing that is to swap the Data Cloud Org access token for the Data Cloud Server access token. This process is documented in the Data Cloud Reference Guide.

Back to Postman, you need to create a request with these characteristics:

HTTP Verb:

  • POST

URL:

Headers:

  • We must provide an Authorization header, just like in any other REST API call to Salesforce, carrying the access token:

Authorization: Bearer {{_orgAccessToken}}

Body:

  • Type: x-www-form-urlencoded
  • The body should have these three keys:

grant_type: urn:salesforce:grant-type:external:cdp
subject_token: {{_orgAccessToken}}
subject_token_type: urn:ietf:params:oauth:token-type:access_token        

The request/response looks like this...

Request / Response for getting Data Cloud Server access token
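Outside Postman, the same exchange would look roughly like the sketch below. It is a sketch only: I'm assuming the token-exchange endpoint is {instance_url}/services/a360/token as described in the Data Cloud Reference Guide, and that the environment variables hold the values saved in the previous step.

// Minimal sketch: swap the Data Cloud Org access token for a Data Cloud Server access token.
// The /services/a360/token path is taken from the Data Cloud Reference Guide; verify it against
// the docs for your org. Assumes Node 18+ (built-in fetch) and an ES module.
const orgInstanceUrl = process.env.ORG_INSTANCE_URL; // e.g. https://yourOrg.my.salesforce.com
const orgAccessToken = process.env.ORG_ACCESS_TOKEN; // from the previous step

const res = await fetch(`${orgInstanceUrl}/services/a360/token`, {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    grant_type: "urn:salesforce:grant-type:external:cdp",
    subject_token: orgAccessToken,
    subject_token_type: "urn:ietf:params:oauth:token-type:access_token",
  }),
});

const body = await res.json();
const dcInstanceUrl = `https://${body.instance_url}`; // returned without the https:// prefix
const dcAccessToken = body.access_token;
console.log(dcInstanceUrl);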

Let's save those values into variables, like we did before. Click on the Tests tab and paste this code:

pm.test("response is ok", function () {
    // Assert status is 200
    pm.response.to.have.status(200);
    
    const context = pm.environment.name ? pm.environment : pm.collectionVariables;
    const body = pm.response.json();
    context.set("_dcInstanceUrl", `https://${body.instance_url}`);
    context.set("_dcAccessToken", body.access_token);
})        

Query Data Cloud

Fantastic, we are now ready for the next and last step... querying the data from Data Cloud. That will be accomplished with a simple call to the Data Cloud Query API.

Back to Postman, you need to create a request with these characteristics:

HTTP Verb:

  • POST

URL:

Headers:

  • We must provide an Authorization header, just like in any other REST API call to Salesforce, carrying the access token.
  • But this time, we are using the Data Cloud Server access token:

Authorization: Bearer {{_dcAccessToken}}

Body:

  • Type: raw
  • The body, in JSON format, contains the SQL query you are running:

{
    "sql": "SELECT AccountId__c, Id__c, Name__c, Email__c, MailingStreet__c, MailingCity__c, MailingState__c, MailingPostalCode__c, MailingCountry__c FROM Contact_00DHs000003U30g__dll ORDER BY Id__c ASC LIMIT 100"
}        

The request/response looks like this...

Request / Response for getting Data Cloud data
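The same query outside Postman would look something like the sketch below; I'm assuming the v2 Query API endpoint ({dcInstanceUrl}/api/v2/query) and reusing the Data Cloud Server token from the token swap. The DLO/DMO name in the SQL is the one from the article and will differ in your org.

// Minimal sketch: run a SQL query against the Data Cloud Query API (v2).
// Assumes Node 18+ (built-in fetch) and an ES module; the two variables below come from the
// token-swap step (the instance URL already includes the https:// prefix).
const dcInstanceUrl = process.env.DC_INSTANCE_URL;
const dcAccessToken = process.env.DC_ACCESS_TOKEN;

const res = await fetch(`${dcInstanceUrl}/api/v2/query`, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${dcAccessToken}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    sql: "SELECT Id__c, Name__c, Email__c FROM Contact_00DHs000003U30g__dll ORDER BY Id__c ASC LIMIT 100",
  }),
});

const result = await res.json();
console.log(result.metadata, result.data); // rows come back in `data`, column info in `metadata`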

Replace With Secure OAuth Flow

Pretty nice, but there is one thing that I am not quite comfortable with, and neither should you be. We need to use a secure OAuth flow rather than the username-password one used earlier.

One option for doing this is what Salesforce documents in the Salesforce Postman collection for Data Cloud and the Data Cloud Reference Guide, which relies on JWT.

But we are going to use a different mechanism, which could be simpler than JWT and still secure.

Postman allows us to configure the authentication process, and we are going to do just that. Inside the folder where you have been storing all the previous requests, create a new folder named Secured. Then, on that folder, click on the Authorization tab and select OAuth 2.0.

Create a new folder for authorization, then set it to OAuth 2.0

Validate these settings are configured:

Type:

  • OAuth 2.0

Add auth data to:

  • Request Headers

Scroll down to the Configure New Token section, and make these changes:

Token Name:

  • access_token

Grant type

  • Implicit
  • This is the OAuth flow that we will use. We will be discussing this further below.

Callback URL (disabled)

Auth URL

Client ID

  • {{consumerKey}}
  • I am using a variable that is set in the folder or a parent folder. In this case, the variable is set on the parent folder ;-)

Scope

  • api cdp_api cdp_query_api
  • These are the scopes required by Data Cloud.
  • We can't use the refresh_token scope in this case, because the flow we are using does not support it.

Client Authentication

  • Send as Basic Auth header

The screen should look like this:

Authentication Screen configured for Data Cloud access
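For reference, behind that screen Postman builds a standard Salesforce authorization request. Here is a rough sketch of what it sends for the User-Agent (Implicit) grant, assuming the login.salesforce.com endpoint and a placeholder callback URL (use your My Domain and your Connected App's registered Callback URL):

// Rough sketch of the authorization URL Postman opens for the User-Agent (Implicit) flow.
// response_type=token is what makes this the Implicit flow; the redirect_uri must match the
// Callback URL registered in the Connected App (placeholder below).
const authorizeUrl = new URL("https://login.salesforce.com/services/oauth2/authorize");
authorizeUrl.search = new URLSearchParams({
  response_type: "token",
  client_id: "<your Consumer Key>",
  redirect_uri: "<the Callback URL registered in the Connected App>",
  scope: "api cdp_api cdp_query_api",
}).toString();
console.log(authorizeUrl.toString());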

Let's review the grant type. Salesforce has multiple OAuth flows available; we saw the OAuth 2.0 Username-Password Flow earlier. There are a few more, all documented in the OAuth Authorization Flows page, but Salesforce uses slightly different names than Postman. For example:

Postman Grant Types determine the OAuth flow

Since we are on the topic of OAuth flows, keep in mind that not all OAuth flows support all the scopes. Although we set the refresh_token and offline_access scopes in the connected app, the flow we are using in Postman (User-Agent/Implicit) does not support them. If you need to review the available flows, please check out the recording of a webinar I did back in March 2022 on OAuth flows.

Now that we have the authentication configured, let's use it. First, we are going to make a simple REST API call to Salesforce, just like a ping, to check the limits usage.

To do that, we need to use the authentication we just configured to get an access token. Scroll down, click the Get New Access Token button at the bottom of the page, and then follow the steps in this video:

Steps to get a new access token

Now that we have an access token stored, let's test it. Create a new GET request to this URL: {{_orgInstanceUrl}}/services/data/v59.0/limits, where {{_orgInstanceUrl}} is a variable we set earlier with the instance URL returned when we logged into Salesforce. Place the request in the new folder, and set the authorization to Inherit auth from parent.

REST API tester
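Outside Postman, that ping is just an authenticated GET. A minimal sketch, assuming Node 18+ and that the environment variables hold the org instance URL and access token obtained in the earlier steps:

// Minimal sketch: "ping" the org by checking limits usage with the org access token.
// Assumes Node 18+ (built-in fetch) and an ES module; placeholders come from earlier steps.
const res = await fetch(`${process.env.ORG_INSTANCE_URL}/services/data/v59.0/limits`, {
  headers: { Authorization: `Bearer ${process.env.ORG_ACCESS_TOKEN}` },
});
console.log(await res.json()); // DailyApiRequests, DataStorageMB, and other limit buckets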

So the authentication works fine. Let's now see if we can swap the Data Cloud Org access token for the Data Cloud Server access token. We'll repeat the step we did before by creating a POST request with this body:

grant_type: urn:salesforce:grant-type:external:cdp
subject_token: {{_orgAccessToken}}
subject_token_type: urn:ietf:params:oauth:token-type:access_token        

But if you pay attention, we have a small problem: how can we access the Data Cloud Org access token that was retrieved by Postman? We need it in the header (which Postman adds automatically) but also in the body.

We need to ask Postman for that value. Fortunately, Postman exposes it after a successful request, such as the limits check; we just need to ask for it. So the Tests tab of the limits request contains this script, which reads the value and stores it in a custom variable named _SecuredToken:

const context = pm.environment.name ? pm.environment : pm.collectionVariables;
const token = pm.request.auth.oauth2.get('accessToken');
console.log(token);
context.set("_SecuredToken", token);        

So we need to change the body of our request to this:

grant_type: urn:salesforce:grant-type:external:cdp
subject_token: {{_SecuredToken}}
subject_token_type: urn:ietf:params:oauth:token-type:access_token        

Leave the authorization for the Query request as No Auth; otherwise, Postman would add the access token from the Data Cloud Org, but we need the Data Cloud Server one. The request is otherwise the same as the one we created before working with the Postman authorization settings, except for the subject_token.

Today we saw how to use the Data Cloud APIs with Postman to retrieve data from Data Cloud. Part of this process is swapping the Data Cloud Org access token for a Data Cloud Server access token. That access token is useful for more than just retrieving data (remember, we should favor pushing over pulling); it also works with the other Data Cloud APIs.
