

Configuring aws-cli

:::{admonition} Stay Organized with MFA
:class: tip

Be careful not to use the Root Account. AWS Account ID =
:::

Databricks Accredited Platform Administrator

1. Databricks Accounts

An organization is migrating to Databricks. The data team’s leader wants to understand which assets and content will be located in the organization’s cloud account and which assets and content will be located in Databricks’ cloud account. Which of the following will be located in the organization’s cloud account? Select two responses.

2. Databricks Accounts for Use-cases

A data team is training a new administrator for the Databricks Lakehouse Platform. The new administrator has learned about three key Databricks services: the Data Science and Engineering Workspace, Databricks SQL, and Databricks Machine Learning. The administrator is concerned about the challenges of governing the data assets in each of these services simultaneously across the organization’s three deployed workspaces. Which of the following solutions can be used to address the new administrator’s concerns? Select one response.

3. Security

A Databricks platform administrator is learning to secure the Databricks Lakehouse Platform. They need to know which objects are securable within the Databricks security model.

Which of the following objects are securable? Select two responses.

4. IAM

A Databricks platform administrator wants to improve the organization’s identity management within Databricks. The administrator is tired of copying and pasting privileges across groups. They currently have four groups with the following identities:

Which of the following approaches can be used to simplify the identity management to address the administrator’s concern as much as possible? Select one response.

5. IAM Levels

A Databricks platform administrator is determining whether to use account-level identities or workspace-level identities. They need to understand the differences between the two types of identities. Which of the following is true of account-level identities but not of workspace-level identities? Select two responses.

6. IAM Levels and Groups

A Databricks account administrator would like to assign existing account-level groups and users to a newly created workspace. However, the account administrator is unable to complete the task. Which of the following must occur within the new workspace prior to assigning existing groups and users? Select one response.

7. Account Groups

A Databricks account administrator is becoming too busy to manage the administration of an organization’s account by themselves. They would like to add another account administrator. The new account administrator is already a user in the account. Which of the following approaches can be used to convert an existing account user into an account administrator? Select two responses.

8. Databricks Regions

A platform administrator wants to configure Databricks SQL for the regions in which the users are located. Each region uses its own workspace. In which of the following ways can the administrator localize Databricks SQL from the Databricks SQL Admin Console? Select two responses.

9. Databricks Environments (SQL)

A platform administrator has set up a workspace and it is ready for use within the Data Science & Engineering workspace and Databricks Machine Learning. However, the administrator is still in the process of configuring Databricks SQL. The administrator would like to allow users to use the workspace while preventing access to Databricks SQL until it is properly configured. Which of the following approaches can the administrator use to restrict access to Databricks SQL? Select one response.

10. Secrets

A data engineering team is using Databricks secrets to securely authenticate to upstream REST APIs. Which of the following commands can be used to read the secrets in a notebook? Select one response.
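The read pattern in question can be sketched in a few lines. In a real notebook, `dbutils` is injected by the Databricks runtime; the stub class below only mimics the `dbutils.secrets.get(scope, key)` interface so the snippet is self-contained, and the scope and key names are made up for illustration.

```python
# Sketch of reading a Databricks secret in a notebook.
# On a real cluster `dbutils` is provided by the runtime; the stub below
# exists only so this example runs anywhere.

class _Secrets:
    """Mimics the dbutils.secrets.get(scope, key) interface."""
    _store = {("rest-api-scope", "api-token"): "s3cr3t-value"}  # hypothetical secret

    def get(self, scope: str, key: str) -> str:
        return self._store[(scope, key)]

class _DBUtils:
    secrets = _Secrets()

dbutils = _DBUtils()

# The call a notebook would actually make:
token = dbutils.secrets.get(scope="rest-api-scope", key="api-token")
```

Note that values read this way are redacted when displayed in notebook output.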

11. Cluster Configuration

A Databricks platform administrator wants to create a set of preconfigured clusters for users to use during their interactive data work. The admin does not want the users to be able to change the cluster’s configuration, but users should be able to terminate and start the cluster based on whether anyone is using it. Which of the following cluster-level permissions should the platform admin assign to the users? Select one response.
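The distinction the question turns on can be modeled in a few lines. The level names below (`CAN_ATTACH_TO`, `CAN_RESTART`, `CAN_MANAGE`) follow Databricks cluster-level permissions, but the action sets are a simplified sketch rather than the authoritative ACL matrix.

```python
# Simplified model of Databricks cluster-level permissions.
# Action sets are illustrative; check the cluster access-control
# documentation for the complete matrix.

CLUSTER_PERMISSIONS = {
    "CAN_ATTACH_TO": {"attach"},
    "CAN_RESTART": {"attach", "start", "restart", "terminate"},
    "CAN_MANAGE": {"attach", "start", "restart", "terminate", "edit", "delete"},
}

def allowed(level: str, action: str) -> bool:
    """Return True if the given permission level permits the action."""
    return action in CLUSTER_PERMISSIONS[level]

# A level that lets users start/terminate without editing the configuration:
assert allowed("CAN_RESTART", "terminate")
assert not allowed("CAN_RESTART", "edit")
```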

12. Libraries

A data science team wants to use a Python package that is not included in the Databricks Runtime or Databricks Runtime for Machine Learning, but they are tired of installing the package at the top of each notebook. They plan on using this package in nearly every workload and want a solution that will keep working well into the future. Which of the following approaches can the data science team use to meet their requirements? Select one response.

13. Databricks SQL

A team of business analysts is using Databricks SQL for their analytics workloads. They have become aware of the Serverless SQL warehouses in Databricks SQL, and they would like to try and use them. In which of the following ways do Serverless Databricks SQL warehouses differ from Classic Databricks SQL warehouses? Select two responses.

14. Analytics Workloads

A data analysis organization is using Databricks SQL for their analytics workloads. Each team within the organization has their own SQL warehouse. One of the larger teams is complaining that their queries take too long to complete when a large number of the team members are simultaneously working. Which of the following approaches can be used to improve runtimes in this situation? Select one response.

15. BI Data Analysts

A data analyst team wants to use Tableau to further process data that is transformed and manipulated within Databricks SQL. The team’s platform administrator knows that this integration can be performed using Partner Connect. Which of the following approaches does Databricks recommend for authenticating this integration with Partner Connect? Select one response.

16. Unity Catalog

A data organization is using a Unity Catalog-enabled Databricks workspace to complete its work. A workspace administrator is doing an audit of all files on the system, and they are going to start by looking through the DBFS root locations. Which of the following can the workspace administrator expect to find in the subdirectories of the DBFS root folder of this Databricks workspace? Select two responses.

17. Unity Catalog

A platform administrator wants to create a new metastore to be shared across workspaces that will be enabled for Unity Catalog. Which of the following steps does the platform administrator need to accomplish as a part of creating the new metastore? Select one response.

18. Unity Catalog

A platform administrator is reviewing Unity Catalog and attempting to decide whether or not the organization should enable its workspaces for Unity Catalog. Each department within the organization will have its own metastore. The administrator is concerned about the recommendation to create managed tables without specifying a location, because they do not want their data stored in the DBFS root, which is accessible to all users in a workspace. Which of the following statements is a valid explanation of why the administrator should not worry about this concern? Select one response.

19. Unity Catalog

A platform administrator is onboarding their data teams onto Unity Catalog, and the data engineers have questions about the Unity Catalog data object hierarchy. Which of the following lists the Unity Catalog data objects in order of largest to smallest? Select one response.

20. Unity Catalog

A platform administrator is attempting to secure access to their organization’s data after enabling Unity Catalog. They want to know which privileges are available to be granted for users for each object. Which of the following privileges are available to be granted for tables? Select two responses.

21. Unity Catalog

A user owned a series of objects within a Unity Catalog metastore, but that user is leaving the organization. The objects now need to have their ownership transferred to a different user. Which of the following can be used to change the ownership of the objects to the new user? Select two responses.

22. GitHub Integration

A team of data engineers is onboarding onto Databricks and they would like to use their GitHub repositories within Databricks Repos. Which of the following approaches can each user follow to configure their GitHub account to work with Databricks Repos? Select one response.

23. Workflows

A data engineering team is looking to automate their Databricks workloads using Workflows. They would like to create a Job with a single task as their first workflow. Because this Job will regularly run, they would like to keep compute costs to a minimum. Which of the following steps does the platform administrator need to take to help the team minimize compute costs for their Workflows? Select two responses.

24. Notifications

A data analytics team has set up a scheduled query to run every day at noon. The team has also configured an alert for the query. When the alert is triggered, the user that created the alert gets an email. However, they would like the entire team (including those that do not use Databricks) to be notified. Which of the following approaches can the platform administrator use so that the entire team can be notified by Databricks when the alert is triggered? Select two responses.

25. Secret Scopes

A platform administrator has added a secret scope new-secret-scope to integrate with upstream REST APIs used by the entire data organization. The administrator now wants to restrict access on that secret scope so that all users in the group all-users can read the secrets but cannot write to them. Which of the following approaches can be used by the platform administrator to complete this task? Select two responses.
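For context, secret scope ACLs use three permission levels, READ, WRITE, and MANAGE, where each level implies the ones below it. The sketch below models that containment; the scope and group names come from the question, and an actual grant would be issued through the Secrets API or the Databricks CLI rather than this toy model.

```python
# Toy model of Databricks secret-scope ACL levels:
# READ < WRITE < MANAGE, each level implying those below it.

LEVELS = ["READ", "WRITE", "MANAGE"]

def grants(granted: str, needed: str) -> bool:
    """Return True if the granted level implies the needed level."""
    return LEVELS.index(granted) >= LEVELS.index(needed)

# Granting READ on new-secret-scope to all-users allows reading but not writing:
acl = {("new-secret-scope", "all-users"): "READ"}
granted = acl[("new-secret-scope", "all-users")]
assert grants(granted, "READ")
assert not grants(granted, "WRITE")
```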

AWS Platform Administrator

1. One

Which three elements must be created within Databricks prior to creating a Databricks workspace in a custom VPC? (Choose three answers)

2. Two

Which two statements are true of encryption key configurations? (Choose two answers)

3. Three

Which element do you have to create when integrating Databricks with an AWS-managed service?

4. Four

What is true of an external location?

It requires account admin privileges to create.

It provides the ability to control access to a portion of an external storage bucket.

It requires metastore admin privileges to create.

It can only provide access control to an entire external storage bucket.

5. Five

What are the subnet requirements for each GCP Databricks workspace?

Two private subnets

Two public subnets

Two public subnets and two private subnets

One public subnet and one private subnet

6. Six

What are two prerequisites for creating an AWS Databricks metastore? (Choose two answers)

Bucket

Credential configuration

IAM role with appropriate permissions

Storage configuration

7. Seven

Which two authentication schemes are supported for querying a list of workspaces programmatically? (Choose two answers)

Credentials passthrough

Basic

OAuth

PAT
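For context, the two common header styles look like this when calling the Accounts API from Python’s standard library. The account ID, user, password, and token values are placeholders; only the header construction is shown, and no request is sent.

```python
import base64

account_id = "00000000-0000-0000-0000-000000000000"  # placeholder
url = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/workspaces"

# Basic auth: base64-encoded "user:password" in the Authorization header.
user, password = "admin@example.com", "example-password"  # placeholders
basic_header = "Basic " + base64.b64encode(f"{user}:{password}".encode()).decode()

# Token-based auth (e.g. an OAuth access token) uses a Bearer header:
token = "example-token"  # placeholder
bearer_header = f"Bearer {token}"
```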

8. Eight

Where are storage credentials created?

Workspace/Admin console

Account console/Data page

SQL

Workspace/Data explorer

9. Nine

Who is the trusted principal when configuring the permissions policy associated with an external storage bucket?

Static Unity Catalog IAM role

IAM role used for accessing the metastore

IAM role that is generated when creating the storage credential

IAM role used for provisioning the workspace

10. Ten

What mechanism enables Databricks clusters to access AWS-managed services?

Access token

VPC peering

Instance profile

Databricks account settings

11. Eleven

Where is regionality determined when setting up workspaces in your own VPC?

Workspace

VPC

Subnet

IP address ranges

12. Twelve

Which three statements describe the relationship between workspaces and metastores? (Choose three answers)

Metastores and workspaces must be in the same cloud region.

A workspace can have many metastores assigned.

A metastore can be assigned to many workspaces.

There can be only one metastore in a cloud region.

There can only be one workspace in a cloud region.

13. Thirteen

Which two tasks are part of the general pattern involved in connecting Databricks to AWS-managed services? (Choose two answers)

Define a custom policy with permissions to perform operations with the service

Create a suitably privileged IAM role

Enable or configure the service in your account

Enable access to the service in the account console

Create a storage credential that references the service

14. Fourteen

Which two elements must be created within Databricks prior to creating a Databricks workspace? (Choose two answers)

Credential configuration

Network configuration

Storage configuration

Metastore

15. Fifteen

Which API is used to create a workspace?

Unity Catalog

SCIM

Account

Workspace
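As a reminder of what such a call carries, the sketch below assembles the request body for the workspace-creation endpoint of the Databricks Account API (`POST /api/2.0/accounts/{account_id}/workspaces` on AWS). All IDs and the workspace name are placeholders, and the exact field names should be verified against the current Account API reference.

```python
# Sketch of a workspace-creation request body for the Databricks Account API.
# All IDs below are placeholders; no request is sent.

account_id = "00000000-0000-0000-0000-000000000000"  # placeholder
endpoint = f"/api/2.0/accounts/{account_id}/workspaces"

# The body references objects created beforehand in the account console:
body = {
    "workspace_name": "analytics-prod",              # hypothetical name
    "aws_region": "us-east-1",
    "credentials_id": "<credential-configuration-id>",
    "storage_configuration_id": "<storage-configuration-id>",
    "network_id": "<network-configuration-id>",      # needed for a customer-managed VPC
}
```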

16. Sixteen

Which three elements can encryption key configurations be applied to? (Choose three answers)

System storage bucket

Cluster disk volumes

Root storage bucket

External storage bucket

Metastore bucket

17. Seventeen

What are the workspace bucket requirements?

One for system and DBFS

Three: one for system, one for DBFS, and one for metastore

Workspaces can share a bucket

Two: one for system and one for DBFS

18. Eighteen

What mechanism registers a VPC into your Databricks account?

Credential configuration

VPC endpoint

Network configuration

Storage configuration

19. Nineteen

What are the two minimum requirements to access external storage? (Choose two answers)

Storage configuration

Storage credential

Cross-account IAM role with appropriate permissions to access the bucket

Instance profile attached to the cluster running the workloads

External location

20. Twenty

Which two elements must be created within AWS prior to creating a Databricks workspace? (Choose two answers)

Bucket

Two subnets

Cross-account IAM role with appropriate permissions

VPC