You have an Azure Cosmos DB Core (SQL) API account that is used by 10 web apps. You need to analyze the data stored in the account by using Apache Spark to create machine learning models. The solution must NOT…
HOTSPOT You have an Azure Cosmos DB Core (SQL) API account named account1. In account1, you run the following query in a container that contains 100 GB of data. SELECT * FROM c WHERE LOWER(c.categoryid) = "hockey" You view the following…
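The LOWER() call in the query above forces a scan, because the system function is evaluated against every item rather than served from the index. A common workaround is to persist a normalized copy of the property at write time and query it with an exact (indexed) match. A minimal sketch, assuming the item shape shown; the helper and property names are illustrative:

```python
def normalize_item(item: dict) -> dict:
    """Add a lowercased copy of categoryid so queries can use an
    exact, index-served match instead of LOWER() at query time."""
    normalized = dict(item)
    normalized["categoryIdLower"] = item["categoryid"].lower()
    return normalized

# The query can then be rewritten without LOWER():
#   SELECT * FROM c WHERE c.categoryIdLower = 'hockey'
item = normalize_item({"id": "1", "categoryid": "Hockey"})
print(item["categoryIdLower"])  # hockey
```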
HOTSPOT You plan to deploy two Azure Cosmos DB Core (SQL) API accounts that will each contain a single database. The accounts will be configured as shown in the following table. How should you provision the containers within each account…
You have a database in an Azure Cosmos DB Core (SQL) API account. You need to create an Azure function that will access the database to retrieve records based on a variable named accountnumber. The solution must protect against SQL…
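To protect against SQL injection, the account number should be bound as a query parameter rather than concatenated into the query text. A minimal sketch of the parameterized-query shape the Cosmos DB SQL API accepts (built as plain data here; with the azure-cosmos SDK the same spec is passed to `container.query_items`):

```python
def build_query(accountnumber: str) -> dict:
    """Build a parameterized Cosmos DB SQL query spec. The value is
    bound to @accountnumber server-side, so user input is never
    spliced into the query text."""
    return {
        "query": "SELECT * FROM c WHERE c.accountnumber = @accountnumber",
        "parameters": [{"name": "@accountnumber", "value": accountnumber}],
    }

# Even a hostile value stays inert data, never query syntax:
spec = build_query("x' OR 1=1 --")
```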
HOTSPOT You have an Azure Cosmos DB Core (SQL) API account used by an application named App1. You open the Insights pane for the account and see the following chart. Use the drop-down menus to select the answer choice that…
You have a database in an Azure Cosmos DB Core (SQL) API account. The database is backed up every two hours. You need to implement a solution that supports point-in-time restore. What should you do first? A. Enable Continuous Backup…
You have an Azure Cosmos DB Core (SQL) API account. You configure the diagnostic settings to send all log information to a Log Analytics workspace. You need to identify when the provisioned request units per second (RU/s) for resources within…
You have a container named container1 in an Azure Cosmos DB Core (SQL) API account. You need to provide a user named User1 with the ability to insert items into container1 by using role-based access control (RBAC). The solution must…
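Least-privilege data-plane RBAC here means a custom role definition whose only item-level data action is create. A sketch of the role-definition body; the action strings follow the documented Microsoft.DocumentDB data-action format (readMetadata is included because data-plane clients need it to connect), while the scope and role name are placeholders:

```python
def insert_only_role(role_name: str) -> dict:
    """Custom Cosmos DB SQL role definition granting only item
    creation, plus the metadata read that SDK clients require."""
    return {
        "RoleName": role_name,
        "Type": "CustomRole",
        "AssignableScopes": ["/"],  # placeholder; scope this to container1 in practice
        "Permissions": [{
            "DataActions": [
                "Microsoft.DocumentDB/databaseAccounts/readMetadata",
                "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/create",
            ],
        }],
    }
```

The definition would then be created and assigned to User1 via the Azure CLI or ARM; no read, replace, or delete item actions are granted.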
You are implementing an Azure Data Factory data flow that will use an Azure Cosmos DB (SQL API) sink to write a dataset. The data flow will use 2,000 Apache Spark partitions. You need to ensure that the ingestion from…
You need to configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account. The data from a container named telemetry must be added to a Kafka topic named iot. The solution must store…
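Routing a Cosmos DB container to a Kafka topic is done in the source-connector configuration through a topic map in `topic#container` form. A sketch of the config under the assumption that the kafka-connect-cosmosdb source connector is used; the property names follow that connector's documented settings, and the database name is an assumed placeholder:

```python
def kafka_source_config(endpoint: str, master_key: str) -> dict:
    """Sketch of a Cosmos DB Kafka source-connector config that maps
    the 'telemetry' container to the 'iot' topic. Verify property
    names against the connector version you deploy."""
    return {
        "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector",
        "connect.cosmos.connection.endpoint": endpoint,
        "connect.cosmos.master.key": master_key,
        "connect.cosmos.databasename": "db1",  # assumed database name
        # topic map format is topic#container
        "connect.cosmos.containers.topicmap": "iot#telemetry",
    }
```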